Summer time approaches and the days seem to get just a bit easier! People are starting to catch up on things, delete old emails, and clean out old files. What? Your people don’t automatically delete files that aren’t needed anymore? You are telling me that they expect you to keep and back up everything that they have ever created! It is time to clean out the trash and free up some space. Lucky for us, this whole process becomes super easy when we start deleting old files with PowerShell!
Before we jump to the script, you will need to decide on two things. First, what is the oldest file that you want to keep? We will measure this by the date accessed attribute. In our environment, we delete any file that hasn’t been used for three years. If today is June 1st, 2014, we only want files accessed on or after 06/01/2011.
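If you would rather calculate the cutoff than hard-code a date, PowerShell can do the date math for you. A small sketch (the $Cutoff variable name is just an example):

```powershell
# Cutoff for "files not accessed in three years", calculated from today
$Cutoff = (Get-Date).AddYears(-3)
$Cutoff
```

You could then filter with { $_.LastAccessTime -lt $Cutoff } instead of a fixed date string.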
Second, you will need to think about any files to exclude. For example, you might have log files, videos, or records that should never be deleted. If you have files like this, you will need to create a CSV to exclude them.
At the top of your CSV, enter Name. In the lines below, you can specify folder names, file names, or a mixture. The screenshot above shows a few different examples, including some with wildcards. Got your maximum age in mind and your exclusion CSV created? Great – let’s create our PowerShell cleanup script.
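An exclusions.csv along these lines would work (the entries below are made-up examples; substitute your own folder and file names):

```
Name
Logs
*.mp4
ImportantRecords
```

Each line after the Name header becomes one exclusion pattern that the script wraps in wildcards.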
Our Deleting Old Files with PowerShell Script
Launch PowerShell ISE and paste the script below. On the second $exclusions line, change the import-csv command so that it points to your new CSV. In the $Directory line, replace \\SERVER\SHARE and specify the share that you want to clean up. In that same line, change the Last Access Time date to match your maximum file age.
# The line below is the path to the exceptions file - one entry per line
$exclusions = @()
$exclusions = Import-Csv .\exclusions.csv
# The lines below delete everything EXCEPT what is in the exclusion file.
Clear-Variable Directory -ErrorAction SilentlyContinue
$Directory = Get-ChildItem \\SERVER\SHARE -Recurse |
    Where-Object { $_.LastAccessTime -lt "06/01/2011" } |
    Sort-Object Name | Select-Object Name, FullName, LastAccessTime
foreach ($exclusion in $exclusions) {
    $Exclusion = $exclusion.Name
    $Directory = $Directory | Where-Object { $_.FullName -notlike "*$Exclusion*" }
}
$Directory | Export-Csv .\Deleted.csv -NoTypeInformation -Append
$Directory.FullName | Remove-Item -Recurse -WhatIf -Verbose
You can now run this script to test your results. Any file that would be deleted will be recorded in a Deleted.csv file. This file is created in the current run path of your PowerShell console. When you are ready to run this script in live action mode, remove the -WhatIf parameter in the very last line. If you have multiple shares to clean up, you can create a new variable named $Shares and have the script walk through each item.
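As a sketch of that multi-share idea (the $Shares variable name and the share paths below are hypothetical placeholders):

```powershell
# Hypothetical list of shares to clean up - replace with your own UNC paths
$Shares = @('\\SERVER\SHARE1', '\\SERVER\SHARE2')

foreach ($Share in $Shares) {
    # Gather stale files for this share, filtered the same way as the main script
    $Directory = Get-ChildItem $Share -Recurse |
        Where-Object { $_.LastAccessTime -lt "06/01/2011" } |
        Select-Object Name, FullName, LastAccessTime

    # Log and (once -WhatIf is removed) delete, just like the single-share version
    $Directory | Export-Csv .\Deleted.csv -NoTypeInformation -Append
    $Directory.FullName | Remove-Item -Recurse -WhatIf -Verbose
}
```

The exclusion loop from the main script could be dropped into the foreach body unchanged.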
This is how we keep our file hoarders at bay! If you have any suggestions to improve this script, leave a comment below. If you use another method, I would love to hear about it!
Hey Joseph,
can you take a look at my script below.
(Instead of creating a CSV, I just include the exclusions in the script. I need to keep any folder with ‘data’, hence the *data*.)
please advise as it is not working.
$limit = (Get-Date).AddDays(-28)
$Backupfolders = get-childitem -Path "D:\Test"
$exclusions = @('*data*','*DATA*')
Clear-Variable Directory -ErrorAction SilentlyContinue
foreach ($folder in $Backupfolders)
{
$InstancePath = $folder.fullname
$Directory = Get-ChildItem -Path $InstancePath -Recurse | where-object { $_.LastWriteTime -lt $limit } | sort Name |select name,fullname,lastaccesstime
foreach ($exclusion in $exclusions){
$Exclusion = $exclusion.name
$Directory = $Directory | Where-Object {$_.FullName -notlike "*$exclusion*"}
}
$Directory.fullname | Remove-Item -recurse -force -confirm:$false
}
Oh, I forgot to mention that I need to check the subfolders of D:\Test and loop the script within the subfolders only (Folder_X).
eg :
D:\Test\Folder_A
D:\Test\Folder_B
D:\Test\Folder_C
Get rid of this line and try it out: $Exclusion = $exclusion.name
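With that line gone, the loop variable keeps its string value and the exclusion works as intended. A sketch using the commenter’s values (note that -notlike is case-insensitive, so the second pattern is redundant):

```powershell
# One pattern covers 'data' in any casing, since -notlike ignores case
$exclusions = @('*data*')
foreach ($exclusion in $exclusions) {
    $Directory = $Directory | Where-Object { $_.FullName -notlike "*$exclusion*" }
}
```

The original $Exclusion = $exclusion.name line overwrote the loop variable with $null (a plain string has no Name property), so "*$exclusion*" became "**" and matched every file.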
Nice article but why not just use File Server Resource Manager (FSRM)?
http://blogs.technet.com/b/filecab/archive/2009/05/11/dealing-with-stale-data-on-file-servers.aspx
At the time, our server was a 2003 box. This solution is better though – I guess I need to revisit this post!
Another great post!
One thing that would be useful as well would be to have the script scan the folders, and if they are now empty, delete the folders as well.
Try this borrowed script out. You will need to set the location at the top. It has a -whatif statement next to the remove-item line.
Set-Location \\SERVER\SHARE
$items = Get-ChildItem -Recurse
foreach($item in $items)
{
if( $item.PSIsContainer )
{
$subitems = Get-ChildItem -Recurse -Path $item.FullName
if($subitems -eq $null)
{
"Remove item: " + $item.FullName
Remove-Item $item.FullName -WhatIf
}
$subitems = $null
}
}
And I am glad you liked the post!! Nice comments make them worth writing!
This is a great idea. Where I work, I doubt I would ever be able to actually delete the files. When I can find some time, I want to change this up a little and, instead of deleting files, move them to a network share. That would not be so practical at a large organization, but we only have 200 folks, and we bend over backwards for them. I have several with less than 5% of hard drive space available, and they never have time to clean up their drives, so this might solve that for us. Thanks for continuing to post all of these great ideas!!!!
Thanks Matt!
How big are those hard drives? They must have a ton of documents!!!