Files and Folders

Summary: In this tutorial, you will learn to master file system operations in PowerShell: navigate directories, create and manipulate files, read and write content, search with Select-String, manage permissions, and work with paths.


File system operations are among the most common automation tasks. Whether you're organizing files, searching for content, processing logs, or managing backups, understanding PowerShell's file system cmdlets is essential.

PowerShell treats the file system as one of many "providers" — the same cmdlets that work with files also work with the registry, environment variables, certificates, and more. This unified approach means once you learn these commands, you can apply them everywhere.
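A quick way to see this provider model for yourself:

```powershell
# List the installed providers (FileSystem, Registry, Environment, ...)
Get-PSProvider

# List the drives each provider exposes (C:, HKCU:, Env:, Cert:, ...)
Get-PSDrive

# The same cmdlet works against different providers
Get-ChildItem C:\Windows     # files and folders
Get-ChildItem Env:           # environment variables
```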

Why PowerShell Excels at File Operations

Traditional command-line file operations involve cryptic commands (ls, cp, mv) and complex syntax for filtering and processing. PowerShell provides:

  • Consistent verb-noun naming: Get-ChildItem, Copy-Item, Remove-Item — clear, discoverable, self-documenting
  • Object-based output: File objects have properties (.Name, .Length, .LastWriteTime) you can directly access
  • Powerful filtering: Combine with Where-Object, Select-Object, and pipeline operations
  • Rich metadata: Access creation time, attributes, permissions, owner, and more without external tools

Navigating the File System

Before manipulating files, you need to navigate. PowerShell provides cmdlets that feel familiar to both Windows and Unix users.

Getting Your Current Location

# Show current directory
Get-Location           # PowerShell cmdlet
pwd                    # Alias (Unix-style)
 
# Get just the path string
(Get-Location).Path    # "C:\Users\YourName\Documents"
 

Get-Location returns a PathInfo object, not just a string. This object has a .Path property with the full directory path.

Changing Directories

# Change to specific directory
Set-Location C:\Users          # PowerShell cmdlet
cd C:\Users                    # Alias (universal)
 
# Navigate with shortcuts
Set-Location ~                 # Home directory ($HOME)
Set-Location ..                # Parent directory
Set-Location -                 # Previous directory (PowerShell 6.2+)
Set-Location \                 # Root of current drive
 
# Change drive
Set-Location D:\
 

Why Set-Location instead of cd? In scripts, explicit cmdlet names improve clarity. cd is convenient for interactive use, but Set-Location is unambiguous.

Stack-Based Navigation (Push/Pop)

PowerShell maintains a location stack, letting you "bookmark" locations and return to them:

# Save current location, navigate elsewhere
Push-Location C:\Windows\System32
Get-ChildItem                      # Work in System32
 
# Do work in another location
Push-Location C:\Logs
Get-ChildItem *.log                # Work in Logs
 
# Return to previous location
Pop-Location                       # Back to System32
Pop-Location                       # Back to original location
 

This is invaluable in scripts that need to change directories temporarily without losing their starting point:

function Process-LogFiles {
    Push-Location C:\Logs
    try {
        # Process files...
        Get-ChildItem *.log | ForEach-Object { Process-Log $_ }
    }
    finally {
        Pop-Location    # Always return to original location
    }
}
 

Listing Files and Directories: Get-ChildItem

Get-ChildItem is PowerShell's most versatile file listing command. It replaces dir (Windows) and ls (Unix) with a single, powerful cmdlet.

Basic Listing

# List current directory
Get-ChildItem                    # PowerShell cmdlet
ls                               # Alias (Unix-style)
dir                              # Alias (DOS-style)
gci                              # Short alias
 
# List specific directory
Get-ChildItem C:\Windows
 
# List specific file pattern
Get-ChildItem *.txt              # Positional parameter
Get-ChildItem -Filter "*.txt"    # Explicit parameter (faster)
 

Files vs Directories

# Files only
Get-ChildItem -File
 
# Directories only
Get-ChildItem -Directory
 
# Count each
(Get-ChildItem -File).Count
(Get-ChildItem -Directory).Count
 

Why this matters: Filtering at the cmdlet level is more efficient than filtering with Where-Object afterward.
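For example, both of the following return only files, but the first lets the filesystem provider do the filtering instead of building every object and discarding the directories afterward:

```powershell
Get-ChildItem -File                                     # filtered by the provider
Get-ChildItem | Where-Object { -not $_.PSIsContainer }  # filtered after the fact
```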

Recursive Listing

# List all files in subdirectories
Get-ChildItem -Recurse
 
# Control recursion depth
Get-ChildItem -Recurse -Depth 2      # Only 2 levels deep
 
# Exclude matching names from the results
Get-ChildItem -Recurse -Exclude "node_modules","bin","obj"
 

Performance note: Deep recursive listings can be slow on large directory trees. Use -Depth to limit scope, or -ErrorAction SilentlyContinue to skip permission-denied directories:

Get-ChildItem C:\ -Recurse -Depth 3 -ErrorAction SilentlyContinue
 

Advanced Filtering

# Multiple patterns with -Include (requires -Recurse or *)
Get-ChildItem -Include "*.txt","*.md","*.log" -Recurse
 
# Exclude patterns
Get-ChildItem -Exclude "*.tmp","*.bak" -Recurse
 
# Hidden and system files
Get-ChildItem -Hidden                 # Hidden files only
Get-ChildItem -System                 # System files only
Get-ChildItem -Force                  # Everything (hidden + system)
 
# Specific attributes
Get-ChildItem | Where-Object { $_.Attributes -band [System.IO.FileAttributes]::ReadOnly }
 

Filter vs Include vs Where-Object:

  • -Filter: Fastest (filesystem-level filtering)
  • -Include/-Exclude: Moderate (PowerShell-level)
  • Where-Object: Slowest (post-processing)

Prefer -Filter when possible for performance.
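You can verify the difference on your own machine with Measure-Command (timings vary; C:\Windows\System32 is used here simply as an example of a large directory):

```powershell
# Filesystem-level filtering (fastest)
Measure-Command { Get-ChildItem C:\Windows\System32 -Filter "*.dll" }

# PowerShell-level wildcard matching
Measure-Command { Get-ChildItem C:\Windows\System32\* -Include "*.dll" }

# Post-processing every object (slowest)
Measure-Command { Get-ChildItem C:\Windows\System32 | Where-Object Extension -eq ".dll" }
```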

Practical Listing Examples

Files modified in the last 24 hours:

$yesterday = (Get-Date).AddDays(-1)
Get-ChildItem -File -Recurse | Where-Object { $_.LastWriteTime -gt $yesterday }
 

10 largest files:

Get-ChildItem -File -Recurse -ErrorAction SilentlyContinue |
    Sort-Object Length -Descending |
    Select-Object -First 10 Name, @{N='SizeMB';E={[math]::Round($_.Length/1MB,2)}}, DirectoryName |
    Format-Table -AutoSize
 

Find empty directories:

Get-ChildItem -Directory -Recurse |
    Where-Object { (Get-ChildItem $_.FullName -Force -ErrorAction SilentlyContinue).Count -eq 0 } |
    Select-Object FullName
 

Disk usage by top-level folder:

Get-ChildItem -Directory | ForEach-Object {
    $size = (Get-ChildItem $_.FullName -Recurse -File -ErrorAction SilentlyContinue |
        Measure-Object Length -Sum).Sum
    [PSCustomObject]@{
        Folder = $_.Name
        SizeMB = [math]::Round($size / 1MB, 2)
        SizeGB = [math]::Round($size / 1GB, 2)
    }
} | Sort-Object SizeMB -Descending | Format-Table -AutoSize
 

These patterns demonstrate the power of combining Get-ChildItem with pipeline operations.

Creating Files and Directories

Creating Files

# Create empty file
New-Item -ItemType File -Name "readme.txt"
New-Item -ItemType File -Path "C:\Temp\notes.txt"
 
# Create file with content
"Hello World" | Out-File "greeting.txt"
Set-Content "data.txt" -Value "Sample content"
 
# Create file and parent directories if needed
New-Item -ItemType File -Path "docs\guides\setup.md" -Force
 

-Force creates any missing parent directories automatically — no need to create them manually first.

Creating Directories

# Create single directory
New-Item -ItemType Directory -Name "ProjectFiles"
mkdir ProjectFiles                # Alias
 
# Create nested directories
New-Item -ItemType Directory -Path "src\components\layout" -Force
 
# Create multiple directories
"src", "tests", "docs", "build" | ForEach-Object {
    New-Item -ItemType Directory -Name $_ -Force
}
 

Why New-Item for everything? Consistency. One cmdlet handles files, directories, registry keys, symlinks — learning one pattern works everywhere.
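For instance, the symlink and registry cases mentioned above look like this (the paths are placeholders; creating symbolic links on Windows may require elevation or Developer Mode):

```powershell
# Symbolic link pointing at a file
New-Item -ItemType SymbolicLink -Path "current.log" -Target "C:\Logs\app-2024.log"

# Registry key via the same cmdlet (Windows, Registry provider)
New-Item -Path "HKCU:\Software\MyApp"
```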

Copying, Moving, and Renaming

Copying Files and Directories

# Copy single file
Copy-Item "source.txt" "destination.txt"
 
# Copy to different directory
Copy-Item "report.pdf" "C:\Backup\"
 
# Copy directory recursively
Copy-Item "ProjectFiles" "ProjectBackup" -Recurse
 
# Copy with pattern matching
Copy-Item "*.log" "C:\Logs\Archive\"
 
# Copy and overwrite
Copy-Item "config.json" "C:\App\" -Force
 
# Copy specific file types recursively
Get-ChildItem -Filter "*.jpg" -Recurse | ForEach-Object {
    $dest = $_.FullName -replace 'SourceFolder', 'DestFolder'
    $destDir = Split-Path $dest
    if (!(Test-Path $destDir)) {
        New-Item $destDir -ItemType Directory -Force | Out-Null
    }
    Copy-Item $_.FullName $dest
}
 

Moving Files

# Move file
Move-Item "oldname.txt" "newname.txt"
Move-Item "file.txt" "C:\Archive\"
 
# Move directory
Move-Item "OldFolder" "NewFolder"
 
# Move with overwrite
Move-Item "data.json" "backup\data.json" -Force
 
# Move files matching pattern
Move-Item "*.bak" "C:\Backup\"
 

Move vs Copy + Remove: Move-Item is atomic within the same filesystem. It's faster and safer than copying then deleting.

Renaming Files

# Rename single file
Rename-Item "oldname.txt" "newname.txt"
 
# Rename with regex replacement
Get-ChildItem "*.jpeg" | ForEach-Object {
    Rename-Item $_ ($_.Name -replace '\.jpeg$', '.jpg')
}
 
# Add prefix to files
Get-ChildItem "*.txt" | ForEach-Object {
    Rename-Item $_ "backup_$($_.Name)"
}
 
# Remove pattern from filenames
Get-ChildItem "*_copy.txt" | ForEach-Object {
    $newName = $_.Name -replace '_copy', ''
    Rename-Item $_ $newName
}
 

Deleting Files and Directories

# Delete single file
Remove-Item "temp.txt"
 
# Delete multiple files
Remove-Item "*.tmp"
 
# Delete directory
Remove-Item "OldFolder" -Recurse        # Recurse required for non-empty dirs
 
# Delete with confirmation
Remove-Item "important.docx" -Confirm
 
# Delete without confirmation (dangerous!)
Remove-Item "C:\Logs\*" -Recurse -Force
 
# Delete files older than 30 days
$cutoff = (Get-Date).AddDays(-30)
Get-ChildItem "C:\Logs" -File | Where-Object { $_.LastWriteTime -lt $cutoff } | Remove-Item
 

Remove-Item with -Recurse -Force is permanent

There's no Recycle Bin. Files are immediately deleted. Always test with -WhatIf first:

Remove-Item "C:\Temp\*" -Recurse -Force -WhatIf
 

This shows what would be deleted without actually deleting.

Reading File Content

Reading Text Files

# Read entire file as array of lines
$lines = Get-Content "config.txt"
$lines.Count                  # Number of lines
$lines[0]                     # First line
 
# Read as single string
$content = Get-Content "file.txt" -Raw
 
# Read specific lines
Get-Content "large.log" -TotalCount 10       # First 10 lines
Get-Content "large.log" -Tail 20             # Last 20 lines
 
# Stream large files (memory-efficient)
Get-Content "huge.log" | ForEach-Object {
    if ($_ -match "ERROR") {
        Write-Host $_ -ForegroundColor Red
    }
}
 
# Monitor file in real-time (like tail -f)
Get-Content "app.log" -Wait -Tail 10
 

Raw vs line-by-line: -Raw reads the entire file as a single string, preserving exact formatting. Without -Raw, you get an array of strings (one per line), which is easier for line-by-line processing.
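You can confirm the difference in type directly (assuming file.txt has more than one line):

```powershell
(Get-Content "file.txt").GetType().Name        # Object[] (array of lines)
(Get-Content "file.txt" -Raw).GetType().Name   # String (one string)
```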

Reading Binary Files

# Read binary content
$bytes = Get-Content "image.png" -AsByteStream    # PowerShell 7+
$bytes = Get-Content "image.png" -Encoding Byte   # PowerShell 5.1
 
# Read first N bytes
$header = Get-Content "data.bin" -AsByteStream -TotalCount 100
 

Writing File Content

Writing Text Files

# Overwrite file
"Hello World" | Out-File "greeting.txt"
Set-Content "data.txt" -Value "New content"
 
# Append to file
"New line" | Out-File "log.txt" -Append
Add-Content "log.txt" -Value "Another line"
 
# Write array of lines
$lines = @("Line 1", "Line 2", "Line 3")
$lines | Out-File "output.txt"
Set-Content "output.txt" -Value $lines
 
# Write with specific encoding
"Ünïcödé" | Out-File "unicode.txt" -Encoding UTF8
 
# Write without newline
"Text" | Out-File "file.txt" -NoNewline
 

Out-File vs Set-Content vs Add-Content:

  • Out-File: Converts objects to text representation, then writes
  • Set-Content: Writes strings directly (overwrites)
  • Add-Content: Writes strings directly (appends)

For simple text writing, Set-Content is clearest.
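The distinction matters most with non-string input: Out-File runs objects through PowerShell's formatting system (what you would see in the console), while Set-Content calls each object's ToString(). A quick way to compare:

```powershell
$procs = Get-Process | Select-Object -First 3

$procs | Out-File "formatted.txt"       # table layout, like console output
Set-Content "plain.txt" -Value $procs   # one ToString() result per line
```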

Efficient Large File Writing

# Use StringBuilder for performance
$sb = [System.Text.StringBuilder]::new()
1..10000 | ForEach-Object {
    [void]$sb.AppendLine("Line $_")
}
$sb.ToString() | Out-File "large.txt"
 
# Or use .NET StreamWriter
$stream = [System.IO.StreamWriter]::new("large.txt")
try {
    1..10000 | ForEach-Object {
        $stream.WriteLine("Line $_")
    }
}
finally {
    $stream.Close()
}
 

Searching File Content: Select-String

Select-String is PowerShell's equivalent of grep — search for patterns in files:

# Search single file
Select-String -Path "app.log" -Pattern "ERROR"
 
# Search multiple files
Select-String -Path "*.txt" -Pattern "TODO"
 
# Case-sensitive search
Select-String -Path "data.txt" -Pattern "Error" -CaseSensitive
 
# Search recursively (Select-String has no -Recurse; pipe from Get-ChildItem)
Get-ChildItem -Path src -Filter "*.cs" -Recurse | Select-String -Pattern "TODO"
 
# Show context lines
Select-String -Path "app.log" -Pattern "exception" -Context 2,3
# Shows 2 lines before and 3 lines after each match
 
# Count matches
(Select-String -Path "*.txt" -Pattern "keyword").Count
 
# Get just the matching lines
Select-String -Path "log.txt" -Pattern "ERROR" | ForEach-Object { $_.Line }
 
# Search for multiple patterns
Select-String -Path "*.cs" -Pattern "TODO|FIXME|HACK"
 
# Exclude files from search
Get-ChildItem -Recurse -Include "*.txt" -Exclude "*test*" |
    Select-String -Pattern "keyword"
 

Output format: Select-String returns MatchInfo objects with properties:

  • .Filename: Name of the file
  • .Path: Full path
  • .LineNumber: Line number where match was found
  • .Line: The full line containing the match
  • .Pattern: The search pattern
  • .Matches: The regex match objects
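A sketch of putting those properties to work, building a simple file:line report (the paths and patterns here are placeholders):

```powershell
Select-String -Path "*.log" -Pattern "ERROR" | ForEach-Object {
    "{0}:{1}: {2}" -f $_.Filename, $_.LineNumber, $_.Line.Trim()
}

# Extract captured groups from the underlying match objects
Select-String -Path "*.log" -Pattern "ERROR (\d+)" | ForEach-Object {
    $_.Matches[0].Groups[1].Value
}
```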

Testing Path Existence

# Test if path exists
Test-Path "C:\Windows"               # True
Test-Path "C:\DoesNotExist"          # False
 
# Test specific type
Test-Path "readme.txt" -PathType Leaf       # File
Test-Path "C:\Windows" -PathType Container  # Directory
 
# Test multiple paths
@("file1.txt", "file2.txt", "file3.txt") | ForEach-Object {
    [PSCustomObject]@{
        File = $_
        Exists = Test-Path $_
    }
} | Format-Table
 

Working with Paths

# Combine paths (handles separators correctly)
Join-Path "C:\Users" "Documents\file.txt"
Join-Path $HOME "Desktop" "notes.txt"   # Multiple child paths (PowerShell 6+)
 
# Get parts of a path
Split-Path "C:\Users\Alice\Documents\report.pdf" -Parent     # C:\Users\Alice\Documents
Split-Path "C:\Users\Alice\Documents\report.pdf" -Leaf       # report.pdf
Split-Path "C:\Users\Alice\Documents\report.pdf" -Extension  # .pdf (PowerShell 6+)
 
# Convert relative to absolute
Resolve-Path ".\file.txt"         # Full path
Resolve-Path "..\*.txt"            # Supports wildcards
 
# Get relative path
Resolve-Path "C:\Windows\System32" -Relative    # Relative to current location
 

File and Directory Properties

PowerShell file objects are rich with metadata:

$file = Get-Item "document.pdf"
 
# Basic properties
$file.Name               # Filename
$file.FullName           # Full path
$file.Extension          # .pdf
$file.Length             # Size in bytes
$file.Directory          # Parent directory object
$file.DirectoryName      # Parent directory path
 
# Timestamps
$file.CreationTime
$file.LastWriteTime
$file.LastAccessTime
 
# Attributes
$file.Attributes         # ReadOnly, Hidden, System, Archive, etc.
$file.IsReadOnly         # Boolean
 
# Check specific attributes
if ($file.Attributes -band [System.IO.FileAttributes]::Hidden) {
    "File is hidden"
}
 

Modifying Attributes

# Set read-only
Set-ItemProperty "file.txt" -Name IsReadOnly -Value $true
 
# Remove read-only
Set-ItemProperty "file.txt" -Name IsReadOnly -Value $false
 
# Hide file
(Get-Item "secret.txt").Attributes += 'Hidden'
 
# Unhide file
$file = Get-Item "secret.txt" -Force
$file.Attributes = $file.Attributes -band (-bnot [System.IO.FileAttributes]::Hidden)
 
# Update timestamps
$file = Get-Item "old.txt"
$file.LastWriteTime = Get-Date
$file.CreationTime = (Get-Date).AddYears(-1)
 

Exercises

🏋️ Exercise 1: File Organization Script

Create a script that organizes files in a directory by extension:

  • Scan a specified directory
  • Create subdirectories for each file extension
  • Move files into their respective extension folders
  • Handle files without extensions
Solution:
function Organize-FilesByExtension {
    [CmdletBinding(SupportsShouldProcess)]
    param(
        [Parameter(Mandatory)]
        [ValidateScript({ Test-Path $_ -PathType Container })]
        [string]$Path
    )
 
    Write-Host "Organizing files in: $Path" -ForegroundColor Cyan
 
    # Get all files
    $files = Get-ChildItem -Path $Path -File
 
    # Group by extension
    $fileGroups = $files | Group-Object Extension
 
    foreach ($group in $fileGroups) {
        # Determine folder name
        if ($group.Name) {
            $folderName = $group.Name.TrimStart('.')
        } else {
            $folderName = "NoExtension"
        }
 
        $destFolder = Join-Path $Path $folderName
 
        # Create destination folder
        if (-not (Test-Path $destFolder)) {
            if ($PSCmdlet.ShouldProcess($destFolder, "Create directory")) {
                New-Item $destFolder -ItemType Directory | Out-Null
                Write-Host "Created folder: $folderName" -ForegroundColor Green
            }
        }
 
        # Move files
        foreach ($file in $group.Group) {
            $dest = Join-Path $destFolder $file.Name
            if ($PSCmdlet.ShouldProcess($file.Name, "Move to $folderName")) {
                Move-Item $file.FullName $dest -Force
                Write-Verbose "Moved $($file.Name) to $folderName"
            }
        }
    }
 
    Write-Host "Organization complete!" -ForegroundColor Green
}
 
# Test
Organize-FilesByExtension -Path "C:\Users\Public\Downloads" -Verbose -WhatIf
 
🏋️ Exercise 2: Duplicate File Finder

Create a function that finds duplicate files based on content (file hash):

  • Hash all files in directory
  • Group by hash
  • Report duplicates with size and locations
Solution:
function Find-DuplicateFiles {
    [CmdletBinding()]
    param(
        [Parameter(Mandatory)]
        [string]$Path,
 
        [switch]$Recurse
    )
 
    Write-Host "Scanning for duplicates in: $Path" -ForegroundColor Cyan
 
    # Get all files
    $params = @{
        Path = $Path
        File = $true
        ErrorAction = 'SilentlyContinue'
    }
    if ($Recurse) { $params.Recurse = $true }
 
    $files = Get-ChildItem @params
 
    Write-Host "Found $($files.Count) files. Computing hashes..." -ForegroundColor Yellow
 
    # Compute hash for each file
    $hashedFiles = $files | ForEach-Object {
        Write-Progress -Activity "Hashing files" -Status $_.Name
        $hash = (Get-FileHash $_.FullName -Algorithm MD5).Hash
        [PSCustomObject]@{
            Path = $_.FullName
            Name = $_.Name
            SizeMB = [math]::Round($_.Length / 1MB, 2)
            Hash = $hash
        }
    }
 
    Write-Progress -Activity "Hashing files" -Completed
 
    # Find duplicates
    $duplicates = $hashedFiles | Group-Object Hash | Where-Object { $_.Count -gt 1 }
 
    if ($duplicates.Count -eq 0) {
        Write-Host "No duplicates found!" -ForegroundColor Green
        return
    }
 
    Write-Host "`nFound $($duplicates.Count) sets of duplicates:" -ForegroundColor Red
 
    foreach ($group in $duplicates) {
        Write-Host "`nDuplicate Set (Hash: $($group.Name.Substring(0,8))...):" -ForegroundColor Yellow
        Write-Host "  Size: $($group.Group[0].SizeMB) MB" -ForegroundColor White
        Write-Host "  Copies:" -ForegroundColor White
        $group.Group | ForEach-Object {
            Write-Host "    $($_.Path)" -ForegroundColor Gray
        }
    }
 
    # Return for further processing
    return $duplicates
}
 
# Test
Find-DuplicateFiles -Path "C:\Users\Public\Pictures" -Recurse
 
🏋️ Exercise 3: Log File Cleaner

Create a script that cleans old log files:

  • Remove .log files older than specified days
  • Archive logs between threshold days to .zip
  • Report space freed
Solution:
function Clean-LogFiles {
    [CmdletBinding(SupportsShouldProcess)]
    param(
        [Parameter(Mandatory)]
        [string]$LogPath,
 
        [int]$DeleteOlderThanDays = 90,
        [int]$ArchiveOlderThanDays = 30,
        [string]$ArchivePath = "$LogPath\Archive"
    )
 
    if (-not (Test-Path $LogPath)) {
        throw "Log path does not exist: $LogPath"
    }
 
    $deleteCutoff = (Get-Date).AddDays(-$DeleteOlderThanDays)
    $archiveCutoff = (Get-Date).AddDays(-$ArchiveOlderThanDays)
 
    $logFiles = Get-ChildItem -Path $LogPath -Filter "*.log" -File
    $spaceFreed = 0
 
    Write-Host "Processing $($logFiles.Count) log files..." -ForegroundColor Cyan
 
    # Files to delete (older than delete threshold)
    $deleteFiles = $logFiles | Where-Object { $_.LastWriteTime -lt $deleteCutoff }
 
    if ($deleteFiles) {
        Write-Host "`nDeleting $($deleteFiles.Count) files older than $DeleteOlderThanDays days..." -ForegroundColor Red
        foreach ($file in $deleteFiles) {
            $size = $file.Length
            if ($PSCmdlet.ShouldProcess($file.Name, "Delete")) {
                Remove-Item $file.FullName -Force
                $spaceFreed += $size
                Write-Verbose "Deleted: $($file.Name)"
            }
        }
    }
 
    # Files to archive (between archive and delete thresholds)
    $archiveFiles = $logFiles | Where-Object {
        $_.LastWriteTime -lt $archiveCutoff -and $_.LastWriteTime -ge $deleteCutoff
    }
 
    if ($archiveFiles) {
        if (-not (Test-Path $ArchivePath)) {
            New-Item $ArchivePath -ItemType Directory -Force | Out-Null
        }
 
        $archiveName = "logs_$(Get-Date -Format 'yyyy-MM-dd_HH-mm-ss').zip"
        $archiveFullPath = Join-Path $ArchivePath $archiveName
 
        Write-Host "`nArchiving $($archiveFiles.Count) files to $archiveName..." -ForegroundColor Yellow
 
        if ($PSCmdlet.ShouldProcess($archiveName, "Create archive")) {
            Compress-Archive -Path $archiveFiles.FullName -DestinationPath $archiveFullPath
 
            # Delete archived files
            foreach ($file in $archiveFiles) {
                $spaceFreed += $file.Length
                Remove-Item $file.FullName -Force
                Write-Verbose "Archived and removed: $($file.Name)"
            }
        }
    }
 
    # Report
    Write-Host "`n=== Cleanup Summary ===" -ForegroundColor Green
    Write-Host "Files deleted: $($deleteFiles.Count)"
    Write-Host "Files archived: $($archiveFiles.Count)"
    Write-Host "Space freed: $([math]::Round($spaceFreed / 1MB, 2)) MB"
}
 
# Test
Clean-LogFiles -LogPath "C:\Logs" -DeleteOlderThanDays 90 -ArchiveOlderThanDays 30 -Verbose -WhatIf
 

Summary

PowerShell's file system cmdlets provide consistent, powerful operations:

  • Get-ChildItem: List files with rich filtering options
  • Set-Location/Push-Location/Pop-Location: Navigate efficiently
  • New-Item: Create files and directories
  • Copy-Item/Move-Item/Rename-Item: Manipulate files
  • Remove-Item: Delete with care
  • Get-Content/Set-Content/Add-Content: Read and write files
  • Select-String: Search file content with regex
  • Test-Path: Check existence
  • Path cmdlets: Join, split, and resolve paths

Master these cmdlets to automate file organization, content processing, and system maintenance tasks efficiently.


Written by the ShellRAG Team

The ShellRAG editorial team writes practical, beginner-friendly PowerShell tutorials with tested code examples and real-world use cases. Every article is technically reviewed for accuracy and updated regularly.
