Collect disk space details from remote computers

In this post I will share a PowerShell script to get details such as used space, free space and total size of all the drives on remote computers. The list of remote computers can be fed to the script as a text file.

$CSVFile = "C:\Disk space report.csv"
"Computer Name;Drive Name;Volume Name;Size(GB);Free(GB);Used(GB);Used%" | Out-File -FilePath $CSVFile -Append

# domain admin's credentials
$cre = Get-Credential
$machines = Get-Content "C:\serverset.txt"
foreach ($machine in $machines)
{
 $disks = Get-WmiObject Win32_LogicalDisk -Filter "DriveType=3" -Computer $machine -Credential $cre |
 Select-Object SystemName, DeviceID, VolumeName, Size, FreeSpace
 foreach ($d in $disks)
 {
  # set the current thread's culture to en-US so numbers format consistently
  [threading.thread]::CurrentThread.CurrentCulture = 'en-US'
  $DeviceID = ($d.DeviceID).Replace(":", "")
  $VolumeName = $d.VolumeName
  # [Math]::Round(..., 2) rounds each value to two decimal places
  $size      = [Math]::Round($d.Size/1GB, 2)
  $freespace = [Math]::Round($d.FreeSpace/1GB, 2)
  # used space is simply total size minus free space
  $usespace  = [Math]::Round(($d.Size - $d.FreeSpace)/1GB, 2)
  $usespacepc = [Math]::Round(($usespace/$size)*100, 2)

  $exporttocsv = $machine+";"+$DeviceID+";"+$VolumeName+";"+('{0:N2}' -f $size)+";"+('{0:N2}' -f $freespace)+";"+('{0:N2}' -f $usespace)+";"+('{0:N2}' -f $usespacepc)
  $exporttocsv | Out-File -Append $CSVFile
 }
}

While free space is reported directly, we need to perform a calculation to get used space.

Now we will do the same thing, but this time without any calculation, which improves both speed and accuracy. We will use the Get-PSDrive cmdlet to read used space and free space directly. However, we still need Get-WmiObject to get the total size of each drive and its volume name.

$CSVFile = "C:\Disk space report.csv"
"Computer Name;Drive Name;Volume Name;Size(GB);Free(GB);Used(GB);Used%" | Out-File -FilePath $CSVFile -Append
Write-Host "Computer Name;Drive Name;Volume Name;Size(GB);Free(GB);Used(GB);Used%"

$cre = Get-Credential
$machines = Get-Content "C:\serverset.txt"
foreach ($machine in $machines)
{
 $disk = Get-WmiObject Win32_LogicalDisk -Filter "DriveType=3" -Computer $machine -Credential $cre |
 Select-Object Size, DeviceID, VolumeName
 $disk1 = Invoke-Command -ComputerName $machine {Get-PSDrive} -Credential $cre |
 Select-Object PSComputerName, Name, Used, Free, Provider |
 Where-Object Provider -match "FileSystem"

 foreach ($d in $disk)
 {
  [threading.thread]::CurrentThread.CurrentCulture = 'en-US'
  $deviceid = ($d.DeviceID).Replace(":", "")

  foreach ($d1 in $disk1)
  {
   if ($d1.Name -match $deviceid)
   {
    $VolumeName = $d.VolumeName
    $size       = [Math]::Round($d.Size/1GB, 2)
    $freespace  = [Math]::Round($d1.Free/1GB, 2)
    $usespace   = [Math]::Round($d1.Used/1GB, 2)
    $usespacepc = [Math]::Round(($usespace/$size)*100, 2)

    $exporttocsv = $d1.PSComputerName+";"+$deviceid+";"+$VolumeName+";"+('{0:N2}' -f $size)+";"+('{0:N2}' -f $freespace)+";"+('{0:N2}' -f $usespace)+";"+('{0:N2}' -f $usespacepc)
    $exporttocsv
    $exporttocsv | Out-File -Append $CSVFile
    break
   }
  }
 }
}

Booting from an ISO file on a remote computer via the iLO console on HP ProLiant servers

Accessing an ISO file to install a new OS on a new HP ProLiant server can be tricky if we don't have physical access to the server or if we intend to do this remotely.

On page 237 of the iLO 4 user guide, available at HPE iLO 4 User Guide, I found the below paragraph, which can be an option to access the ISO image from the new physical server.

===============================================================

Using an image file through a URL (IIS/Apache)

You can connect scripted media by using the .NET IRC or Java IRC.

Scripted media supports only 1.44 MB floppy disk images (.img) and CD/DVD-ROM images (.iso).

The image must be on a web server on the same network as iLO.

  1. Start the .NET IRC or Java IRC.
  2. Depending on the image type you will use, select Virtual Drives→URL Removable Media (.img) or Virtual Drives→URL CD-ROM/DVD (.iso). The Image file at URL dialog box opens.
  3. Enter the URL for the image file that you want to mount as a virtual drive, and then click Connect. The virtual drive activity LED does not show drive activity for URL-mounted virtual media.

================================================================

Following is the process flow mentioned in the HP blog ILO 4 Scripted Media URL:

  1. Use an existing web server, or install a new web server for the purpose of delivering the ISO files.
  2. Create a folder to hold the ISO images.
  3. Add ISO file(s).
  4. Enable directory browsing in Web Services. You can do this with the IIS Manager if it is a Windows web server. If you created a custom folder for the files, enable directory browsing on that folder.
  5. You must add a MIME type for the ISO extension. In Server 2008 IIS, you can do this from the HTTP Headers selection in IIS Manager.
     a> .ISO application/octet-stream
     b> .IMG application/octet-stream
  6. Log in to the iLO of the target server, and open the remote console.
  7. At the top of the window, click on Virtual Drives, and then select URL DVD ROM.
  8. Input the HTTP path to the image file, including the file name, and click Connect; it will mount the drive. The path will resemble "http://hostname or IP/folder/filename.ISO".
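Steps 4 and 5 can also be scripted on an IIS box. A minimal sketch, assuming the WebAdministration module is available and a hypothetical isos folder under the Default Web Site:

```powershell
Import-Module WebAdministration

# step 4: enable directory browsing on the folder that holds the images
Set-WebConfigurationProperty -Filter /system.webServer/directoryBrowse `
 -PSPath "IIS:\Sites\Default Web Site\isos" -Name enabled -Value $true

# step 5: register MIME types so IIS will serve .iso and .img files
Add-WebConfigurationProperty -Filter //staticContent -PSPath "IIS:\" -Name collection `
 -Value @{fileExtension='.iso'; mimeType='application/octet-stream'}
Add-WebConfigurationProperty -Filter //staticContent -PSPath "IIS:\" -Name collection `
 -Value @{fileExtension='.img'; mimeType='application/octet-stream'}
```

The site and folder names above are placeholders; adjust them to the web server actually delivering the ISO files.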

Now the problem with this approach is network latency, especially when the infrastructure is globally spread.

So what can be the solution here? Virtual Folders?

Let's read the below two paragraphs taken from the user guide.

==============================================

Operating system considerations: Virtual Folder

  • Windows—A Virtual Folder appears automatically after Windows recognizes the mounting of the virtual USB device. You can use the folder the same way that you use a locally attached device. Virtual Folders are nonbootable. Attempting to boot from the Virtual Folder might prevent the server from starting.

===============================================

Virtual folders enable you to access, browse to, and transfer files from a client to a managed server. You can mount and dismount a local or networked directory that is accessible through the client. After you create a virtual image of a folder or directory, the server connects to the image as a USB storage device, enabling you to browse to the server and transfer the files from the virtual image to the server.

Using iLO Virtual Media The Virtual Folder is nonbootable and read-only; the mounted folder is static. Changes to the client folder are not replicated in the mounted folder.

=================================================

Apart from this, Virtual Folders also have a size limit of 2 GB.

So we come down to our next option -> Image File CD/DVD-ROM

For this option we first need to copy the ISO file to a temporary server on the network where the new physical server is being installed. From this temporary server, we access the iLO of the new server and select the Virtual Drives option Image File CD/DVD-ROM. We then browse to the ISO file stored locally on the temporary server and mount it. As the ISO file is now on the same network as the new server, the network latency is eliminated and access speed is greatly enhanced.

Getting folder owners’ details from remote computers

In this short article, we will try to get information about the owners of folders on remote computers and the permissions they have on them.

The below script traverses one level down the C: drive and then recursively gathers information about all files and subfolders within.

$items = Get-ChildItem "C:\*" -Recurse
$items | Get-Acl |
Select-Object Path, Owner -ExpandProperty Access |
Select-Object @{n="Path";e={$_.Path.Replace("Microsoft.PowerShell.Core\FileSystem::","")}}, Owner, IdentityReference, FileSystemRights, AccessControlType, IsInherited |
Export-Csv "C:\folder_owner_output.csv" -NoTypeInformation

This post was majorly inspired by this blog post: Find file / folder owner information using PowerShell or command prompt.

Outlook Automation

Although Microsoft does not recommend unattended automation of its Office products, we can leverage PowerShell to automate some common Outlook-related operations.

Searching for a particular subject line and then downloading its attachments

$date = get-date -format d

$filepath = "C:\attachments\"
$attr = $date+"_report.csv"
$path = Join-Path $filepath $attr

$olFolderInbox = 6
[threading.thread]::CurrentThread.CurrentCulture = 'en-US'
$outlook = new-object -com outlook.application;
$ns = $outlook.GetNameSpace("MAPI");
$inbox = $ns.GetDefaultFolder($olFolderInbox)
$messages = $inbox.items

foreach($message in $messages){
$msubject = $message.subject
write-host "Messages Subject: $msubject"

if ($msubject.contains("Daily Report")){
$message.attachments|foreach {
Write-Host $_.filename
Write-Host "Attachment is saved at $path"
$_.saveasfile(($path))}
$TargetFolder = $inbox.Folders.Item('Processed')
$message.Move($TargetFolder)
break
}else{continue}
}

Prior to running the above script we must create a folder named "Processed" under the Outlook Inbox. Also, we may have a hard time scheduling this in Task Scheduler, as the account the script runs under should be logged in.
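If we do try Task Scheduler anyway, the task must run interactively (only when the user is logged on) for the Outlook COM object to be usable. A rough sketch, assuming the ScheduledTasks module (Windows 8/Server 2012 or later) and a hypothetical path for the script above:

```powershell
# hypothetical location of the attachment-download script
$action  = New-ScheduledTaskAction -Execute "powershell.exe" `
           -Argument "-ExecutionPolicy Bypass -File C:\scripts\Get-Attachments.ps1"
$trigger = New-ScheduledTaskTrigger -Daily -At 8am
# Interactive logon type = run only when the user is logged on,
# which Outlook COM automation requires
$principal = New-ScheduledTaskPrincipal -UserId "COMPANY\admin" -LogonType Interactive
Register-ScheduledTask -TaskName "DailyReportAttachments" `
 -Action $action -Trigger $trigger -Principal $principal
```

The account and task names are illustrative; the key design choice is the Interactive logon type.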

Sending Mail

$From = "abc@company.com"
$to = "xyz@company.com"
$Cc = "pqr@company.com"

$Subject = "Daily Report"
$Body = "PFA The Excel Report containing details."
$SMTPServer = "server01.company.corp"

Send-MailMessage -From $From -to $To -Cc $Cc -Subject $Subject `
-Body $Body -SmtpServer $SMTPServer -Attachments $path

write-host "Mail sent with report attached"

In the above script we have not made use of a COM object. As a result we can run it unattended as a scheduled task, and execution is fast.

Now the same task can also be done via COM object as can be seen below.

$Username = "MyUserName";
$Password= "MyPassword";

function Send-ToEmail([string]$email, [string]$attachmentpath){

    $message = new-object Net.Mail.MailMessage;
    $message.From = "YourName@gmail.com";
    $message.To.Add($email);
    $message.Subject = "subject text here...";
    $message.Body = "body text here...";
    $attachment = New-Object Net.Mail.Attachment($attachmentpath);
    $message.Attachments.Add($attachment);

    $smtp = new-object Net.Mail.SmtpClient("smtp.gmail.com", "587");
    $smtp.EnableSSL = $true;
    $smtp.Credentials = New-Object System.Net.NetworkCredential($Username, $Password);
    $smtp.send($message);
    write-host "Mail Sent" ; 
    $attachment.Dispose();
 }
Send-ToEmail  -email "reciever@gmail.com" -attachmentpath $path;

The above script was taken from a solution posted in: how to send email with powershell.

Getting DHCP failover information and secondary DHCP server information

In this article, we will discuss a method to get information about the DHCP failover configuration in our domain and the corresponding DHCP server names (primary and secondary). Along with the DHCP server names, we also get information such as scope ID, subnet mask, and start and end range. All the data collected is then appended to a CSV file.

[threading.thread]::CurrentThread.CurrentCulture = 'en-US'
$CurrentDate = Get-Date -format dddd
$CSVFile = "C:\Dhcp_report_$CurrentDate.csv"
$machines = Get-DhcpServerInDC

"DHCPServer,SecondaryDHCPServer,ScopeName,ScopeID,Subnetmask,Startrange,Endrange" |out-file -FilePath $CSVFile -append

foreach ($machine in $machines)
{
 $DHCPName = $machine.DnsName
 if ($DHCPName -notin @("server01.company.corp","server02.company.corp"))
 {
  $AllScopes = Get-DhcpServerv4Scope -ComputerName $DHCPName
  foreach ($scope in $AllScopes)
  {
   $ScopeName = $scope.Name
   $ScopeId = $scope.ScopeId
   $failoverscope = Get-DhcpServerv4Failover -ComputerName $DHCPName -ScopeId $ScopeId -ErrorAction SilentlyContinue
   if ($failoverscope)
   {
    $SecDHCP = $failoverscope.SecondaryServerName
   }
   else
   {
    $SecDHCP = "no failover"
   }
   $Subnetmask = $scope.SubnetMask
   $Startrange = $scope.StartRange
   $Endrange = $scope.EndRange
   $OutInfo = ($DHCPName).Replace(".company.corp","") + "," + ($SecDHCP).Replace(".company.corp","") + "," + $ScopeName + "," + $ScopeId + "," + $Subnetmask + "," + $Startrange + "," + $Endrange
   Write-Host $OutInfo
   Add-Content -Value $OutInfo -Path $CSVFile
  }
 }
}

Editing folder permissions with PowerShell

In this post, we will discuss modifying folder NTFS permissions without a GUI on the Windows platform. Modifying NTFS permissions through the GUI is understandable for a small environment, but it can get really hectic and time consuming in a large one. Now let's have a look at the following PowerShell script, which is a step towards automating this task.


$CSVFile = "C:\log.csv"
$machines = Get-Content "C:\servers.txt"
foreach ($machine in $machines)
{
<# get all network path names of shared folders from another text file,
then process each path
sample path file:
   \\$machine\Users
   \\$machine\Departments
Instead of inputting a text file, we can also use:
$paths = gi "H:\Users\*","H:\Departments\*" #>
 $paths = (Get-Content "C:\paths.txt").Replace('$machine', $machine)
 foreach ($path in $paths)
 {
<# -----part-1 (disable inheritance)-----
To change access of an object on a folder which has been inherited
from a parent folder, we need to disable the inheritance first.
So, traverse one level down in each path, keep only the
folders, then disable inheritance at that level. #>
  Get-ChildItem -Path $path\* | Where-Object { $_.PsIsContainer } | ForEach-Object {
<# We could have also used $acl = Get-Item $_ | Get-Acl,
but that does not work in cases where the account launching
the script is not an owner of the folder whose permissions are
being changed. So we tell PowerShell to only look at the
'Access' section of the security descriptor. #>
   $acl = (Get-Item $_.FullName).GetAccessControl('Access')
<# the first argument ($true) disables inheritance and
the second argument ($false) removes all ACEs inherited
from the parent folder #>
   $acl.SetAccessRuleProtection($true, $false)
   Set-Acl -Path $_.FullName -AclObject $acl | Out-Null
  }
<# ----part-2 (change permissions of an account)----
To avoid the ownership problem here as well, we again
tell PowerShell that we are only interested in the
access entries of each ACL and nothing extra. #>
  $acl2 = Get-Acl -Path $path
  $accesses = $acl2.Access
# examine each ACE in the ACL
  foreach ($access in $accesses)
  {
   $id = $access.IdentityReference.Value
   if ($id -eq "$machine\User")
   {
<# when the condition is true, remove all permissions for
the mentioned account only #>
    $acl2.RemoveAccessRule($access) | Out-Null
   }
  }
<# provide Read and Execute permission at the parent or root
folder level to the same account #>
  $permission = "$machine\User",'ReadAndExecute','ContainerInherit,ObjectInherit','None','Allow'
  $aclrule = New-Object System.Security.AccessControl.FileSystemAccessRule -ArgumentList $permission
  $acl2.SetAccessRule($aclrule)
  Set-Acl -Path $path -AclObject $acl2 | Out-Null
  $OutInfo = $machine+","+$path+","+$aclrule.IdentityReference+","+$aclrule.FileSystemRights
  Write-Host $OutInfo
  Add-Content -Value $OutInfo -Path $CSVFile
 }
}

Here the Inheritance flag is set for both container and object, whereas the Propagation flag is None.

Following is a table which will help in setting the propagation and inheritance flags.

    ╔═════════════╦═════════════╦═══════════════════════════════╦════════════════════════╦══════════════════╦═══════════════════════╦═════════════╦═════════════╗
    ║             ║ folder only ║ folder, sub-folders and files ║ folder and sub-folders ║ folder and files ║ sub-folders and files ║ sub-folders ║    files    ║
    ╠═════════════╬═════════════╬═══════════════════════════════╬════════════════════════╬══════════════════╬═══════════════════════╬═════════════╬═════════════╣
    ║ Propagation ║ none        ║ none                          ║ none                   ║ none             ║ InheritOnly           ║ InheritOnly ║ InheritOnly ║
    ║ Inheritance ║ none        ║ Container|Object              ║ Container              ║ Object           ║ Container|Object      ║ Container   ║ Object      ║
    ╚═════════════╩═════════════╩═══════════════════════════════╩════════════════════════╩══════════════════╩═══════════════════════╩═════════════╩═════════════╝
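As an illustration, an ACE that should apply to "sub-folders and files" only would combine the flags from the table like this (the account name is hypothetical):

```powershell
# inheritance: ContainerInherit,ObjectInherit -> flows to sub-folders and files
# propagation: InheritOnly -> does not apply to the folder itself
$permission = "COMPANY\SomeUser",'ReadAndExecute','ContainerInherit,ObjectInherit','InheritOnly','Allow'
$aclrule = New-Object System.Security.AccessControl.FileSystemAccessRule -ArgumentList $permission
```

The rule can then be attached with $acl.SetAccessRule($aclrule) and written back with Set-Acl, exactly as in the script above.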

We can speed things up by employing the PsExec utility for the above PowerShell script, as discussed in my previous posts. Note that Get-Acl and Set-Acl fail for folders the current user does not own: Set-Acl tries to change the owner of the folder, which is certainly not the intention in most cases.

For further reading, refer to this technet link: Windows PowerShell Tip of the Week where  [System.Security.AccessControl.FileSystemRights] (a .NET Framework file system enumeration) and security descriptors are discussed in greater detail.
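The enumeration mentioned there can be inspected straight from the shell, which is handy when picking a rights value for a new access rule:

```powershell
# list every right defined by the FileSystemRights enumeration
[System.Enum]::GetNames([System.Security.AccessControl.FileSystemRights])
```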

There is another approach to the above scenario, which makes use of the PowerShell module "NTFSSecurity" (File System Security PowerShell Module 4.2.3 by Raimund Andree). We can achieve the desired effect by proceeding as follows.


<#First we need to import the module "NTFSSecurity".
If we want the module to be available to all users on the system,
then it should be placed under the path:
C:\Program Files (x86)\WindowsPowerShell\Modules\NTFSSecurity #>
Import-Module "C:\Program Files (x86)\WindowsPowerShell\Modules\NTFSSecurity"
<#This will list out all the commands which are 
available in this module#>
Get-Command -Module ntfssecurity
<#This will disable inheritance at only one level below the root 
"test" folder. Remove the \* from path, if inheritance is to 
be disabled at all levels inside.#>
get-item "\\server1\test\*" | Disable-NTFSAccessInheritance
<#Now we first get the access of the account whose access we want to
change, then we remove its access from the folder.
Then we add the desired access for the same account to the folder. This
is particularly useful when we want to remove special permissions
granted to an account, which may present a security risk.#>
Get-Item -Path "\\server1\test" |Get-NTFSAccess -Account Users | Remove-NTFSAccess
Add-NTFSAccess -Path "\\server1\test" -Account Users -AccessRights ReadAndExecute -AppliesTo ThisFolderAndSubfoldersOneLevel

Before proceeding with the solutions presented in this post, do take a full backup of the ACLs of the folder in question using the icacls command. This TechNet blog, How to Back Up and Restore NTFS and Share Permissions, provides all the details required for a quick backup and restore of ACLs.
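A minimal sketch of such a backup and restore with icacls (the paths are illustrative):

```powershell
# save the ACLs of the whole folder tree to a text file
# /t = recurse into subfolders, /c = continue on errors
icacls "D:\Shares" /save "C:\acl_backup.txt" /t /c

# restore them later; note that /restore is run against the
# PARENT of the saved paths, not the folder itself
icacls "D:\" /restore "C:\acl_backup.txt"
```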

After racking our brains with these and many other solutions, sometimes we feel that sticking to basics does the job best. A simple command-line approach is both fast and accurate on a massive folder structure. We can use icacls to do the same things we discussed so far with far less effort. For this purpose I am sharing this link, iCACLS.exe (2003 sp2, Vista+), which has all the applications of this command covered.

Installing Updates and patches on remote computers by powershell

In this post, we will use PowerShell to install updates and patches on remote computers using the PsExec utility. Although this script is not as efficient as a full-fledged SCCM or WSUS setup, it can come in handy in scenarios where patching has to be done manually.

There are certain things we need to ensure before proceeding with this method. Firstly, the PsExec service must not be disabled or marked for deletion on the target remote server. Secondly, the account launching the script should have appropriate administrative privileges. Thirdly, execution of scripts must not be disabled on the remote computers via group policy.
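These prerequisites can be checked up front. A rough sketch (the server name is illustrative, and the service check assumes PsExec's usual PSEXESVC service name):

```powershell
$machine = "server01"
# PsExec installs a PSEXESVC service on the target; it must not be disabled
# or stuck in a marked-for-deletion state from an earlier run
Get-Service -ComputerName $machine -Name "PSEXESVC" -ErrorAction SilentlyContinue |
 Select-Object Status, StartType
# check the effective script execution policy on the remote computer
Invoke-Command -ComputerName $machine { Get-ExecutionPolicy }
```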

Step 1> To proceed with this method, first create a runner script, say runme.ps1, as follows:

<#------------Creating log file------------#>
$CSVFile = "C:\patches\servers_status.csv"
<#-----------------------------------------#>

<#-----------Execute PSEXEC----------------#>
$machines = get-content "C:\patches\servers.txt"
ForEach ($machine In $machines) 
{
 if (Test-Connection -ComputerName $machine -Count 2 -ErrorAction SilentlyContinue)
  {
   Write-Host "$machine is up" -ForegroundColor Green
   & C:\Windows\System32\psexec.exe \\$machine -u COMPANY\admin -p ###### -h -accepteula cmd.exe /c "\\SERVER1\patches$\script.bat"
   Restart-Computer $machine -Force
   Write-Host "$machine rebooted" -ForegroundColor Red
   $info = "$machine rebooted"
   Add-content -value $info -Path $CSVFile
  }
else
  {
   Write-Host "$machine is down" -ForegroundColor Red
   $info = "$machine is down"
   Add-content -value $info -Path $CSVFile
  }
}
<#----------------------------------------#>

Step 2> The above script calls out to the following batch file, Script.bat, over the network.

@Echo Off
Net Use T: /delete /yes
Net Use T: \\SERVER1\patches$
CMD /C powershell.exe -ExecutionPolicy Bypass -Command T:\install_patches.ps1
Net Use T: /delete /yes
Exit

Step 3> The above batch file maps the folder containing the scripts as a network drive on the remote computer and then calls the following install_patches.ps1 file, which performs the installation of the update packages. Note that we no longer use network paths for the folders; instead we write complete local paths. As the script runs locally on the remote computer, this avoids the delay caused by network latency.

<#--------Create folder "patches"-----------#>
if (-Not (Test-Path C:\Users\admin\desktop\patches))
{
    md -path C:\Users\admin\desktop\patches
}
Write-Host "patches folder created, starting copy of install files"
<#------------------------------------
Copy installation files from source to
target server's desktop #>
Copy-Item -path "\\SERVER2\Packages\*" -destination "C:\Users\admin\desktop\patches" -Recurse
#----------------------------------#
<#ensure 100% copy by matching source 
and destination folder size#>
Do 
{
 start-sleep 2
 write-host "copy of patches is in progress"
 $FolderList1 = Get-ChildItem "\\SERVER2\Packages\"
 Foreach($Folder1 in $FolderList1)
  {
   $FolderSize1 = (Get-ChildItem "\\SERVER2\Packages\$Folder1" -Recurse | 
   Measure-Object -property length -sum).Sum/1mb
   write-host "source folder size: $FolderSize1" 
  }
 $FolderList2 = Get-ChildItem "C:\Users\admin\desktop\patches"
 Foreach($Folder2 in $FolderList2)
  {
   $FolderSize2 = (Get-ChildItem `
   "C:\Users\admin\desktop\patches\$Folder2" -Recurse | 
   Measure-Object -property length -sum).Sum/1mb
   write-host "destination folder size: $FolderSize2"  
  }
} 
until ($FolderSize1 -eq $FolderSize2)
write-host "installation files copied successfully"
<#--------------------------------------
Start installation of patches and create log file #>
write-host "installation of patches started"
$dir = (Get-Item -Path "C:\Users\admin\Desktop\patches\*" -Verbose).FullName
Foreach($item in (ls $dir *.cab -Name))
{
 echo $item
 $item = $dir + "\" + $item
 dism /online /Add-Package /PackagePath:$item  /NoRestart | Out-Null
}
Foreach($item in (ls $dir *.msu -Name))
{
 echo $item
 $item = $dir + "\" + $item
 wusa $item /quiet /NoRestart | Out-Null
}
$machinename = hostname
write-host "$machinename patching completed"
$outinfo = $machinename + "," + "patching completed"
Add-content -value $outinfo -path "T:\servers_status.csv"

After a successful reboot of all servers, we can run the following script to get the list of hotfixes installed on each remote computer in the past day, and output it to a CSV file named after the corresponding server:

$machines = get-content "C:\patches\servers.txt"
ForEach ($machine In $machines)
{
 get-hotfix -computername $machine |
 Select HotfixID, Description, InstalledOn |
 Where-Object {$_.InstalledOn -gt (get-date).AddDays(-1)} |
 Sort-Object -property InstalledOn -Descending |
 export-csv "C:\patches\$machine hotfixes.csv" -NoTypeInformation
}

Getting Folder Details from Remote Computers using PowerShell

This post is all about getting details of shared folders from remote computers using a few techniques. We can get various pieces of information such as last modified time, ACLs, etc., but things get time consuming in a large production environment when we have to calculate folder sizes.

Method 1: Measure-object

This can work for a small environment, but can take an unreasonably long time in a large one.

$CSVFile = "C:\logs.csv"
$machines = get-content "C:\servers.txt"
"Computer name,Foldername,Modified time,User/Group,Permissions" | 
out-file -FilePath $CSVFile -append
foreach ($machine in $machines) 
{
 $items = Get-ChildItem \\$machine\h$\USER , \\$machine\h$\USERS
 foreach ($item in $items)
  {
<#Check only folders and filter out files#>
   if ($item.Attributes -match "Directory")
    {         
<#-----Get folder name--------------------#>
     $FoldName = $item.Name          
     Write-Host $FoldName
<#------Get last modified timestamp-------#>
     $ModTime = $item.lastwritetime          
     Write-Host $ModTime
<#----------Get folder size---------------#>
     $size = (Get-ChildItem $item.FullName -Recurse | Where-Object { -not $_.PSIsContainer } |
     Measure-Object -Property Length -Sum).Sum/1MB
<# We will collect information about ACL where Folder name 
then matches the name of user
Also we can employ any other condition for our purpose here #>
     $ACLs = get-acl $item.fullname | ForEach-Object { $_.Access  }
     Foreach ($ACL in $ACLs)
      {
       if ($ACL.IdentityReference -eq "Company\$FoldName")
        {
         $OutInfo = $machine+","+$FoldName+","+$ModTime+","+$ACL.IdentityReference+","+$ACL.FileSystemRights+","+$size
         Write-Host $OutInfo
         Add-Content -Value $OutInfo -Path $CSVFile
        }
      }
    }
  } 
}

Method 2: File system object

It is observed that calculating folder size using the FileSystemObject delivers higher speed and more accurate folder sizes.

# $path holds the folder whose size we want
$fso = New-Object -ComObject Scripting.FileSystemObject
$folder = $fso.GetFolder($path)
$size = [math]::Round(($folder.Size)/1MB, 2, 1)

Method 3: PSExec

There are certain things we need to ensure before proceeding with this method. Firstly, the PsExec service must not be disabled or marked for deletion on the target remote server. Secondly, the account launching the script should have appropriate administrative privileges. Thirdly, execution of scripts must not be disabled on the remote computers via group policy.

Step 1> To proceed with this method, first create a runner script, say runme.ps1, as follows:

$CSVFile = "C:\shared_folder\logs.csv"
"File Server,P Folder Name,P Folder Size,Last Modified time,`
User Name,NTFS Access" | out-file -FilePath $CSVFile -append
$machines = get-content "C:\shared_folder\servers.txt"
ForEach ($machine In $machines) 
{
 if (Test-Connection -ComputerName $machine -Count 2 -ErrorAction SilentlyContinue)   
  {
   Write-Host "$machine is up" -ForegroundColor Green
<#-----& is an alternative to Start-Process-------
the -accepteula switch bypasses the EULA acceptance dialog #>
   & C:\Windows\System32\psexec.exe \\$machine -u Company\admin -p ### -h -accepteula cmd.exe /c "\\server1\shared_folder\Script.bat"
  }
 else    
  {
   Write-Host "$machine is down" -ForegroundColor Red
   $info = $machine + "," + "is down"
   Add-content -value $info -Path $CSVFile
  }
}

Step 2> The above script calls out to the following batch file, Script.bat, over the network.

@Echo Off
Net Use T: /delete /yes
Net Use T: \\server1\shared_folder
CMD /C powershell.exe -ExecutionPolicy Bypass -Command T:\Get_ACL.ps1
Net Use T: /delete /yes
Exit

Step 3> The above batch file maps the folder containing the scripts as a network drive on the remote computer and then calls the following Get_ACL.ps1 file, which actually gathers the shared folder information. Note that we no longer use network paths for the folders; instead we write complete local paths. As the script runs locally on the remote computer, this avoids the delay caused by network latency.

$CSVFile = "T:\logs.csv"
$machine = hostname
$Folders = Get-ChildItem h:\USER , h:\USERS


After this we have our usual script, where we can employ either method 1 or method 2.
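For completeness, a sketch of how the elided remainder of Get_ACL.ps1 could apply method 2 to each folder (the CSV column layout here is illustrative):

```powershell
# continuing Get_ACL.ps1: calculate each folder's size with the FileSystemObject
$fso = New-Object -ComObject Scripting.FileSystemObject
foreach ($Folder in $Folders)
{
 if ($Folder.PSIsContainer)
 {
  $size = [math]::Round(($fso.GetFolder($Folder.FullName)).Size/1MB, 2, 1)
  $OutInfo = $machine+","+$Folder.Name+","+$size+","+$Folder.LastWriteTime
  Add-Content -Value $OutInfo -Path $CSVFile
 }
}
```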

As a concluding remark, this method is the fastest of those mentioned in this post and should ease the burden on lots of Windows administrators.

Powershell script for obtaining DHCP scope usage statistics for all the DHCP servers in the domain

In this post I present a solution to a common Windows admin task, where one has to pull a report of DHCP scope usage statistics, and the following script does just that.

Algorithm:

1> Get the list of all DHCP servers in the domain from the Get-DhcpServerInDC command.

2> Store the DHCP servers in a variable $machines.

3> Get the DHCP scopes for each DHCP server stored in $machines, from the Get-DhcpServerv4Scope command.

4> Store the DHCP scopes in a variable $AllScopes.

5> Get usage statistics for each DHCP scope, from the Get-DhcpServerv4ScopeStatistics command.

6> Store the statistics for a particular scope in a variable $ScopeStats.

7> Append the data collected to a CSV file.

Script:

$CurrentDate = Get-Date -format dddd
$CSVFile = "C:\Dhcp_$CurrentDate.csv"
$machines = Get-DhcpServerInDC

"DHCPServer,ScopeName,ScopeID,FreeAddresses,AddressesInUse,PercentInUse" | out-file -FilePath $CSVFile -append

foreach ($machine in $machines)
{
 $DHCPName = $machine.DnsName
 $AllScopes = Get-DhcpServerv4Scope -ComputerName $DHCPName
 foreach ($scope in $AllScopes)
 {
  $ScopeName = $scope.Name
  $ScopeId = $scope.ScopeId
  $ScopeStats = Get-DhcpServerv4ScopeStatistics `
   -ScopeId $ScopeId -ComputerName $DHCPName
  $ScopePercentInUse = $ScopeStats.PercentageInUse
  $Addfree = $ScopeStats.AddressesFree
  $AddUse = $ScopeStats.AddressesInUse
  $OutInfo = $DHCPName + "," + $ScopeName + "," + $ScopeId `
   + "," + $Addfree + "," + $AddUse + "," + $ScopePercentInUse
  Write-Host $OutInfo
  Add-Content -Value $OutInfo -Path $CSVFile
 }
}

We can extend this solution to a scenario where we have to obtain average DHCP scope usage statistics for a whole week. I agree that this is not the most elegant way to approach the requirement, but it gets the job done.

Algorithm:

1> Import the CSVs containing each day's DHCP scope data in array format.

2> Combine the arrays produced to form a single array of data and output it as a CSV file.

3> Remove duplicate columns from the CSV produced in the previous step, by assigning headers to each column, selecting the desired columns, and outputting a CSV file.

4> Calculate the average of each scope's statistics over all seven weekdays, add this data as columns to the CSV generated in the previous step, and output the final CSV file.

Script:

<#---Creating arrays from the csv files for each weekday
and storing them in separate variables---#>

$csv1 = @(Get-Content "C:\Dhcp_Monday.csv")
$csv2 = @(Get-Content "C:\Dhcp_Tuesday.csv")
$csv3 = @(Get-Content "C:\Dhcp_Wednesday.csv")
$csv4 = @(Get-Content "C:\Dhcp_Thursday.csv")
$csv5 = @(Get-Content "C:\Dhcp_Friday.csv")
$csv6 = @(Get-Content "C:\Dhcp_Saturday.csv")
$csv7 = @(Get-Content "C:\Dhcp_Sunday.csv")

<#------Creating an array with combined csvs 
and giving output in csv-------------#>

$csv = @()
for ($i=0; $i -lt $csv1.Count; $i++) 
{
 $csv += $csv1[$i] + ', ' + $csv2[$i] + ', ' + $csv3[$i]+ ', ' + `
 $csv4[$i] + ', ' + $csv5[$i]+ ', ' + $csv6[$i] + ', ' + $csv7[$i]
}
$csv | Out-File "C:\consolidated.csv" -encoding default

<#---Removing duplicate or unwanted columns 
from the combined CSV produced in the previous step----#>

$source = "C:\consolidated.csv"
$destination = "C:\consolidated2.csv"
$finaldestination = "C:\consolidatedfinal.csv"

(Import-Csv $source -Header (1..42) |
Select-Object "1","2","3","4","5","6","10","11","12","16","17",
"18","22","23","24","28","29","30","34","35","36","40","41","42" |
 ConvertTo-Csv -NoTypeInformation |
 Select-Object -Skip 2) -replace '"' | Set-Content $destination

<#-----Averaging the selected columns, removing 
unwanted columns, and writing the final 
output to a CSV------#>

(Import-Csv $destination -Header "DHCP Server","Scope Name","Scope ID",
4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24 |
Select-Object "DHCP Server","Scope Name","Scope ID",
@{Name="(Average of week) No. of IP Addresses free";
Expression={[math]::Round(([Decimal]$_.4 + [Decimal]$_.7 +
[Decimal]$_.10 + [Decimal]$_.13 + [Decimal]$_.16 +
[Decimal]$_.19 + [Decimal]$_.22)/7,2)}},
@{Name="(Average of week) No. of IP Addresses in Use";
Expression={[math]::Round(([Decimal]$_.5 + [Decimal]$_.8 +
[Decimal]$_.11 + [Decimal]$_.14 + [Decimal]$_.17 +
[Decimal]$_.20 + [Decimal]$_.23)/7,2)}},
@{Name="(Average of week) Percent IP Addresses in Use";
Expression={[math]::Round(([Decimal]$_.6 + [Decimal]$_.9 +
[Decimal]$_.12 + [Decimal]$_.15 + [Decimal]$_.18 +
[Decimal]$_.21 + [Decimal]$_.24)/7,2)}} |
ConvertTo-Csv -NoTypeInformation) -replace '"' |
Set-Content $finaldestination
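
As an aside, the positional column arithmetic above can be avoided entirely by importing each day's file as objects and grouping on the scope ID. The following is only a sketch under assumptions: the column names passed to -Header (AddressesFree, AddressesInUse, PercentageInUse) are illustrative labels matching the order the collection script writes its fields in, not names from the original script, and if your per-day files already contain a header row you should drop -Header and use your real column names.

```powershell
$days   = 'Monday','Tuesday','Wednesday','Thursday','Friday','Saturday','Sunday'
$header = 'DHCP Server','Scope Name','Scope ID',
          'AddressesFree','AddressesInUse','PercentageInUse'
$all = foreach ($day in $days) { Import-Csv "C:\Dhcp_$day.csv" -Header $header }

$all | Group-Object 'Scope ID' | ForEach-Object {
    $g = $_.Group
    # CSV fields come in as strings, so cast to [decimal] before averaging
    $avgFree = ($g | ForEach-Object { [decimal]$_.AddressesFree } | Measure-Object -Average).Average
    $avgUsed = ($g | ForEach-Object { [decimal]$_.AddressesInUse } | Measure-Object -Average).Average
    $avgPct  = ($g | ForEach-Object { [decimal]$_.PercentageInUse } | Measure-Object -Average).Average
    [pscustomobject]@{
        'DHCP Server' = $g[0].'DHCP Server'
        'Scope Name'  = $g[0].'Scope Name'
        'Scope ID'    = $_.Name
        '(Average of week) No. of IP Addresses free'    = [math]::Round($avgFree, 2)
        '(Average of week) No. of IP Addresses in Use'  = [math]::Round($avgUsed, 2)
        '(Average of week) Percent IP Addresses in Use' = [math]::Round($avgPct, 2)
    }
} | Export-Csv $finaldestination -NoTypeInformation
```

Note that Export-Csv quotes every field, whereas the original pipeline strips quotes with -replace '"'; either form imports cleanly into Excel.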

<#-------Removing unwanted files---------#>

Remove-Item "C:\consolidated.csv"
Remove-Item "C:\consolidated2.csv"

<#-------Creating a new folder named after the 
current month and week----------#>

$week = (Get-WmiObject Win32_LocalTime).weekinmonth
$month = get-date -Format MMMM

New-Item -ItemType directory -Path "C:\$month Week$week"

<#-----Moving the DHCP reports for each 
weekday to the newly created folder----#>     

Move-Item "C:\Dhcp_Monday.csv" "C:\$month Week$week"
Move-Item "C:\Dhcp_Tuesday.csv" "C:\$month Week$week"
Move-Item "C:\Dhcp_Wednesday.csv" "C:\$month Week$week"
Move-Item "C:\Dhcp_Thursday.csv" "C:\$month Week$week"
Move-Item "C:\Dhcp_Friday.csv" "C:\$month Week$week"
Move-Item "C:\Dhcp_Saturday.csv" "C:\$month Week$week"
Move-Item "C:\Dhcp_Sunday.csv" "C:\$month Week$week"
Move-Item "C:\consolidatedfinal.csv" "C:\$month Week$week"

Thanks to Adam Dimech's Coding Blog (https://code.adonline.id.au/csv-to-xlsx-powershell/), we can convert the final CSV report into a more usable XLSX file, as follows.

<#----Assign the desired path for 
storing final xlsx file to a variable----#>     

$week = (Get-WmiObject Win32_LocalTime).weekinmonth
$month = get-date -Format MMMM
$csv = "C:\$month Week$week\consolidatedfinal.csv" #Location of the source file (moved into the report folder in the previous step)
$xlsx = "C:\$month Week$week\consolidated.xlsx"  
$delimiter = "," 

<#------Create a new Excel workbook 
with one empty sheet-------------#>

[threading.thread]::CurrentThread.CurrentCulture = 'en-US'
$excel = New-Object -ComObject excel.application 
$excel.DisplayAlerts=$false
$workbook = $excel.Workbooks.Add(1)
$worksheet = $workbook.worksheets.Item(1)

<#-----Build the QueryTables.Add command 
and reformat the data--------------#>

$TxtConnector = ("TEXT;" + $csv)
$Connector = $worksheet.QueryTables.`
add($TxtConnector,$worksheet.Range("A1"))
$query = $worksheet.QueryTables.item($Connector.name)
$query.TextFileOtherDelimiter = $delimiter
$query.TextFileParseType  = 1
$query.TextFileColumnDataTypes = ,1 * $worksheet.Cells.Columns.Count
$query.AdjustColumnWidth = 1

<#------Execute & delete the import query------#>

$query.Refresh()
$query.Delete()

<#------Save & close the Workbook as XLSX-------#>

$workbook.SaveAs($xlsx)
$excel.Quit() 
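
One caveat with Excel COM automation: $excel.Quit() alone does not always end the background EXCEL.EXE process, because PowerShell still holds the COM references. A commonly used cleanup sketch, to be run after the Quit() call above:

```powershell
# Release the COM references so the background EXCEL.EXE process can exit
[System.Runtime.InteropServices.Marshal]::ReleaseComObject($worksheet) | Out-Null
[System.Runtime.InteropServices.Marshal]::ReleaseComObject($workbook)  | Out-Null
[System.Runtime.InteropServices.Marshal]::ReleaseComObject($excel)     | Out-Null
[GC]::Collect()
[GC]::WaitForPendingFinalizers()
```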

Thanks to PSExcel, a brilliant PowerShell module by Warren F. (http://ramblingcookiemonster.github.io/PSExcel-Intro/), we can remove inactive DHCP servers' data from the generated XLSX, as follows.
Import-XLSX "C:\$month Week$week\consolidated.xlsx" |
Where-Object { $_.'DHCP Server' -notin @("server1.abc.corp",
"server2.abc.corp") } |
Export-XLSX -Path "C:\$month Week$week\$month Week$week dhcp scope report.xlsx" 

<#---Remove unwanted consolidated csv file---#>

Remove-Item "C:\$month Week$week\consolidated.xlsx"