How to Connect to Remote Containers via PowerShell

What if you could work inside a Docker container without ever leaving Visual Studio Code? Imagine the possibilities for development and testing. This capability is one of the many features that make Visual Studio Code a versatile and powerful code editor: its extensibility lets users perform tasks that would otherwise require heavy tooling. Among these features is the ability to access and manage remote containers using PowerShell directly within VS Code.

So, how do you accomplish this, and why would you want to? In this series, we will delve into the practical and conceptual sides of integrating Docker containers with PowerShell in VS Code. This part will focus on understanding the need for containerized environments, PowerShell’s evolution, and the context around using containers for PowerShell development.

Why Use Docker Containers in VS Code?

The integration of Docker containers into development workflows can vastly improve productivity, consistency, and portability. With containers, developers can:

  • Create isolated environments for testing and development
  • Avoid conflicts with software versions on the host system
  • Reproduce environments that closely resemble production
  • Share setups easily with teams through Dockerfiles and images

When using VS Code, Docker containers can be accessed and managed through extensions and plugins that allow direct interaction with containerized environments. By combining this with PowerShell, developers and administrators gain a flexible, powerful shell environment within a reproducible container.

Why Use Docker Containers in VS Code?

The software development landscape has changed drastically over the past decade. Tools like Docker and Visual Studio Code (VS Code) are at the forefront of this shift, empowering developers to build and test applications across diverse environments with minimal overhead. Among the many integrations VS Code offers, its support for Docker containers is perhaps one of the most powerful and transformative.

But what exactly makes using Docker containers in VS Code such a game-changer? Why should developers and system administrators consider this setup over more traditional workflows? In this article, we’ll explore the practical and strategic reasons for combining Docker and VS Code, touching on portability, isolation, version control, cross-platform development, team collaboration, and automation.

The Challenge of Local Development Environments

Before Docker and container-based development became widespread, setting up a local environment often meant configuring system dependencies, downloading SDKs, and manually aligning versions of frameworks or tools to match production. This process was fragile and time-consuming. Minor mismatches between environments could introduce elusive bugs or break functionality altogether.

For PowerShell developers, for instance, discrepancies between PowerShell Core and the version of Windows PowerShell installed on a machine might affect script behavior. Even minor updates to .NET components could change how modules function. Ensuring consistency required virtual machines or maintaining rigid workstation policies, both of which lacked flexibility.

The Challenge of Local Development Environments

Developing software has always required a local development environment. For decades, this has meant installing the appropriate tools, libraries, runtimes, and packages onto a developer’s computer and configuring them in a way that mimics the intended deployment environment. While this approach seems logical, it introduces numerous challenges, particularly as systems grow in complexity and teams become more distributed. In this article, we will explore the multifaceted challenges of local development environments, why they persist, and what developers and organizations can do to mitigate them.

Configuration Inconsistencies

One of the most common issues developers face is inconsistency between local development environments and production systems. This discrepancy can arise from differences in operating systems, versions of programming languages, installed libraries, environment variables, and system dependencies. For instance, a developer might be using macOS with Python 3.9 while the production server runs Ubuntu with Python 3.10. This difference can lead to subtle bugs, performance issues, or complete application failures.

Furthermore, individual development machines may not have access to proprietary or internal systems that are available in production. This disparity makes testing and debugging difficult and increases the risk that code will fail when deployed. Configuration drift, where two once identical environments evolve in different directions, is another cause of inconsistencies that hinder reliable development.

Dependency Management and Clashes

As modern applications grow in complexity, they often rely on numerous third-party packages, frameworks, and SDKs. Managing these dependencies locally introduces significant risk. Conflicts can occur when two projects require different versions of the same library, especially if the libraries are installed globally on the developer’s system.

Language-specific package managers, such as npm for Node.js, pip for Python, and NuGet for .NET, have made dependency management easier, but they don’t fully eliminate the challenges. Global dependencies, environment path settings, and shared system tools can all clash, causing broken builds and failed tests. Additionally, upgrades to packages can introduce breaking changes, especially when dependencies are not tightly versioned.

Environment-Specific Bugs

Applications that work perfectly in a development environment but fail in production are a common headache for developers. These environment-specific bugs often stem from differences in hardware, operating systems, file system case sensitivity, locale settings, network configurations, or available resources. For example, a script that runs flawlessly on a case-insensitive file system (like Windows) might crash on a Linux server due to a misnamed file.

Reproducing these bugs locally can be extremely difficult. Developers may spend hours or even days attempting to replicate an issue, often without success. These elusive bugs erode confidence in the development process and consume valuable time that could be spent on feature development or optimization.

Time-Consuming Onboarding

Bringing new developers onto a project should be a straightforward process, but the complexity of local environments can make it a drawn-out affair. Developers often receive a lengthy setup guide filled with instructions: install this SDK, configure that environment variable, download these data files, and so on. Each step introduces an opportunity for human error.

The onboarding process is further complicated by a lack of standardization. Developers may have different operating systems, preferences for tools and editors, and pre-existing environments that conflict with the project requirements. As a result, it can take days for a new team member to become productive, reducing the overall efficiency of the team.

Maintenance Overhead

Once a local development environment is set up, it requires ongoing maintenance. Operating system updates can break compatibility with older tools. Package updates may deprecate functions or introduce new bugs. Tools installed globally might get overwritten by other projects, leading to confusion and failures.

Maintaining parity between the development, testing, staging, and production environments is a constant struggle. Even with careful documentation, minor discrepancies can creep in over time, especially when multiple developers are working on the same codebase. This leads to an increased burden on the development and DevOps teams, who must constantly troubleshoot environment-related issues.

Lack of Reproducibility

Reproducibility is a core principle of scientific computing and data science, and it’s equally important in software development. A developer should be able to clone a repository, follow a standard process, and expect the application to work the same way every time. Unfortunately, local environments are notoriously non-reproducible.

Manual setup steps, undocumented dependencies, system-specific tweaks, and assumptions about installed tools all contribute to a brittle environment. When bugs are reported, developers often find themselves asking, “What version of X are you using?” or “Can you send me your config file?” This variability slows down collaboration and makes debugging a team-wide endeavor instead of an isolated task.

Limited Scalability and Testing

Local environments are limited by the resources of the individual developer’s machine. Testing distributed systems, performance under load, or high availability setups often requires multiple services running simultaneously. This kind of testing is not always feasible on a laptop or desktop, leading to gaps in the testing process.

In many cases, developers resort to mocking services or simulating behavior, which can be helpful but doesn’t provide the full picture. Integration testing, security validation, and failover simulations require more robust environments, which are often unavailable locally.

Cross-Platform Development Challenges

Developers working on applications that are deployed across different operating systems face additional hurdles. A feature that works on Windows might behave differently on macOS or Linux. These differences are not just superficial; they can be deeply rooted in system APIs, file path conventions, and default behaviors.

Testing across multiple platforms requires either physical access to each platform or the use of virtual machines and emulators. Both options increase the overhead for the developer and slow down the feedback loop. This becomes even more complex when developing mobile apps or embedded systems, which require platform-specific toolchains.

Security Risks

Installing and configuring a local development environment often involves lowering security barriers. Developers may disable firewalls, install outdated packages, or grant elevated privileges to scripts and processes to get things working. These practices can introduce significant security risks.

In some cases, sensitive data must be accessed locally for testing purposes, which increases the risk of accidental exposure. Additionally, installing unknown or unverified dependencies from the internet opens the door to supply chain attacks. The more complex and fragile the local environment, the harder it becomes to ensure its security.

Inconsistent Tooling and Processes

When each developer sets up their environment, differences in tooling and processes can emerge. One developer might use Docker, another Vagrant, and a third might install everything natively. Some may use Bash scripts, while others prefer Makefiles or PowerShell.

This inconsistency can lead to confusion and wasted time, especially when trying to reproduce a colleague’s bug or follow a shared workflow. Unified tooling and standardized development environments help eliminate these issues, but achieving that consistency across a team is easier said than done.

The Case for Environment Abstraction

Given all these challenges, it’s clear that local development environments are inherently fragile and problematic. One solution is to abstract the environment entirely, using containers, virtual machines, or remote development environments that can be centrally managed and version-controlled.

Containers, particularly with tools like Docker and Kubernetes, provide isolated, reproducible environments that behave the same way on any system. Developers can define their entire environment in a Dockerfile, share it with their team, and know that everyone is working in the same setup. Changes to the environment can be tracked in version control, making updates transparent and collaborative.

Remote development environments, such as those provided by GitHub Codespaces or Visual Studio Code Remote Containers, take this a step further. Developers can connect to a predefined container running in the cloud or on a local Docker instance, eliminating the need to install anything beyond their editor.

Infrastructure as Code for Development Environments

The concept of “Infrastructure as Code” (IaC) is commonly applied to production systems, but it’s equally powerful for development. Tools like Terraform, Ansible, and Docker Compose can be used to define and provision development environments, ensuring consistency across teams.

By treating the development environment as code, teams can review changes, rollback updates, and document assumptions. This practice reduces onboarding time, minimizes bugs related to environment drift, and promotes a culture of transparency and collaboration.

Enter Docker: A Containerized Development Approach

Docker solves many of these issues by encapsulating applications and their dependencies in lightweight, portable containers. A container bundles everything an application needs to run, from runtime libraries and configurations to environment variables. It behaves consistently regardless of the host system, eliminating the age-old problem of “it works on my machine.”

When paired with VS Code, containers become even more powerful. VS Code’s Remote Development extension allows you to run your editor inside a containerized environment. Your source code lives inside the container, and so do the shell, debugging tools, compilers, and extensions. Yet the experience is seamless—VS Code runs locally but feels as if it’s embedded in the container.

Benefits of Using Docker Containers in VS Code

1. Environment Consistency

One of the most important advantages of using Docker containers in VS Code is environmental consistency. Your development container mirrors the production environment closely, right down to the OS version and library configurations.

This is especially crucial when dealing with infrastructure scripting, like PowerShell modules, where small environmental differences can have a big impact. By working inside a container that replicates your server setup (e.g., Ubuntu 18.04, PowerShell 7.2), you ensure that the script behaves the same way during testing as it will in deployment.

2. Isolation of Dependencies

Running PowerShell or any other software inside a Docker container prevents dependency clashes. You can use one container with PowerShell Core 7.2 and another with 7.4, testing scripts against both simultaneously without polluting your host system. Need different versions of AWS CLI, Azure modules, or .NET libraries? No problem—just pull a different image or configure the container accordingly.

This is invaluable when developing for multiple clients or environments. Each container is its own isolated bubble, immune to the issues caused by global package installations or OS-level updates.

3. Rapid Onboarding for New Developers

Docker containers, when used with VS Code, significantly reduce the time it takes for a new developer to get started on a project. Instead of sharing setup instructions or troubleshooting local installation issues, you can simply provide a Dockerfile and a .devcontainer configuration.

The new team member can clone the repo, open it in VS Code, and be coding in a reproducible environment within minutes. This removes friction from team scaling and ensures that everyone is developing under the same assumptions.

4. Cross-Platform Development

VS Code is available on Windows, macOS, and Linux. Docker containers also run on all these platforms (with some help from WSL on Windows). By using containers, you abstract away differences between host operating systems. A Mac user and a Windows user can both develop on a Linux-based container using the same toolset and configuration.

This is particularly useful in projects targeting Linux servers or cloud environments, which may behave differently from local Windows machines. With Docker and VS Code, you’re developing directly in a Linux environment, regardless of your host.

5. Streamlined Testing and Debugging

VS Code supports interactive debugging, terminal access, and task automation within the container. When paired with Docker, you can run your tests directly inside the container, automate builds with Docker Compose, and even spin up supporting services like databases for integration testing.

PowerShell developers, for instance, might test scripts that query REST APIs or connect to SQL Server. Using Docker Compose, they can bundle these services together and simulate the complete application stack—all while writing and debugging from inside VS Code.
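As a minimal sketch of what such a test might look like from the container’s integrated terminal, the snippet below queries a hypothetical REST endpoint exposed by a Compose service named api on port 8080; the service name, port, and response shape are assumptions, not part of any particular stack.

# Query a hypothetical REST endpoint exposed by a Compose service named "api"
$response = Invoke-RestMethod -Uri "http://api:8080/health" -Method Get
# Invoke-RestMethod returns structured objects, so properties can be inspected directly
if ($response.status -eq "ok") {
    Write-Output "API is healthy"
} else {
    Write-Warning "Unexpected health status: $($response.status)"
}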

6. CI/CD Integration and DevOps Readiness

Using containers during development aligns perfectly with CI/CD pipelines. Whether you’re using GitHub Actions, Azure DevOps, or Jenkins, these platforms support container-based builds and deployments.

If your development takes place inside a Docker container configured in VS Code, transitioning to automated builds becomes straightforward. The same Dockerfile used locally can be used in production pipelines, ensuring zero environment drift.

7. Reusability and Modularity

Containers promote a modular way of thinking. You can maintain different Docker images for different use cases: one for testing PowerShell scripts against AWS, another for building modules for internal teams, and another for interacting with Azure resources. Each image becomes a reusable component that’s easy to share across teams or projects.

VS Code lets you define these containers as reusable development environments using a devcontainer.json file. You can define extensions, settings, tasks, and mounts—everything your project needs—right in version control.

8. Security and Risk Mitigation

Developers often need to test third-party modules or run scripts from unfamiliar sources. Running these inside a container provides a security boundary. If a script misbehaves, it can’t affect your host system. You simply destroy the container and start fresh.

This capability is especially helpful when testing potentially harmful operations like registry edits, file deletions, or system reconfigurations. You can even disable network access or file system mounts for high-risk experiments.
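For instance, a throwaway session with no network access and a read-only filesystem can be started with standard Docker flags. This is a sketch against the official PowerShell image; adjust the image tag to whatever you normally use.

# Run untrusted code in a locked-down, disposable container:
# --rm removes the container on exit, --network none disables networking,
# --read-only mounts the container filesystem read-only
docker run --rm -it --network none --read-only mcr.microsoft.com/powershell pwsh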

9. Cloud Development Enablement

As development shifts increasingly to the cloud, the ability to work inside remote containers becomes essential. VS Code supports connecting to containers running on virtual machines or cloud services via SSH or Docker contexts. This means you can run VS Code on your laptop, but edit and debug files on a container in the cloud.

Developers working with cloud-native technologies or Kubernetes will find this approach especially useful. The same container can be deployed to Azure Container Instances, AWS ECS, or Google Cloud Run, with development, testing, and debugging all occurring in the same environment.
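As a rough illustration, Docker contexts let you point the local CLI (and therefore VS Code) at a remote engine over SSH; the user and host names below are placeholders.

# Create a context that talks to a remote Docker engine over SSH (user/host are examples)
docker context create remote-dev --docker "host=ssh://devuser@cloud-vm.example.com"
# Switch to it; subsequent docker commands (and VS Code attach) target the remote engine
docker context use remote-dev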

Practical Example: PowerShell in VS Code with Docker

Let’s look at a concrete scenario. Suppose you’re developing a PowerShell script that interfaces with AWS APIs. You need:

  • PowerShell 7.2
  • AWS Tools for PowerShell
  • A Linux-based image to mimic your production system

Here’s what your Dockerfile might look like:

FROM mcr.microsoft.com/powershell:7.2-ubuntu-20.04

RUN pwsh -Command "Install-Module -Name AWS.Tools.Common -Force"
RUN pwsh -Command "Install-Module -Name AWS.Tools.S3 -Force"

WORKDIR /workspace

In your VS Code .devcontainer/devcontainer.json file, you would define:

{
  "name": "PowerShell AWS Dev",
  "dockerFile": "Dockerfile",
  "extensions": ["ms-vscode.PowerShell"],
  "settings": {
    "terminal.integrated.shell.linux": "/usr/bin/pwsh"
  },
  "mounts": ["source=${localWorkspaceFolder},target=/workspace,type=bind"]
}

Now, when you open the project in VS Code, the environment is pre-loaded with all necessary tools. You can write scripts, authenticate to AWS, and test S3 interactions—all without installing anything natively on your machine.

Conclusion

The combination of Docker containers and Visual Studio Code represents a major leap forward in modern software development. It addresses long-standing challenges around consistency, scalability, and cross-platform support, while enabling powerful new workflows that align with today’s DevOps and CI/CD practices.

For PowerShell developers, or anyone working in dynamic, cloud-integrated environments, this pairing offers a robust, flexible, and secure development experience. Whether you’re building internal modules, developing cloud automation, or managing infrastructure as code, containers in VS Code help you work faster, safer, and with greater confidence.

By using Docker containers in VS Code, you not only enhance your productivity but also align your development practices with modern engineering standards, paving the way for smoother deployments, cleaner codebases, and happier teams.

Evolution of PowerShell

PowerShell began its life as a Windows-only shell, designed for system administrators who needed an effective and extensible tool to manage systems. It provided powerful scripting capabilities and deep integration with Windows and Microsoft applications.

In 2016, Microsoft changed the trajectory of PowerShell with the introduction of PowerShell Core. Built on .NET Core, PowerShell Core was designed to be:

  • Cross-platform, supporting Windows, Linux, and macOS
  • Open source, allowing community involvement
  • Modular, to support faster development and deployment cycles

This move was significant because it opened the door for developers and administrators to use PowerShell outside of the traditional Windows ecosystem. While PowerShell Core lacks some of the deep integration features of Windows PowerShell (due to proprietary dependencies), it brings a level of portability that is essential in modern DevOps workflows.

Compatibility Considerations

Not all scripts written for Windows PowerShell are compatible with PowerShell Core. There are differences in available modules, cmdlets, and system integrations. As a result, it becomes important to test scripts in different versions of PowerShell.

This need for compatibility testing is one of the main reasons to use containers. With Docker, you can:

  • Test scripts across multiple versions of PowerShell
  • Recreate issues in isolated environments.
  • Minimize risks before deploying updates or changes.

For example, if an update to PowerShell Core introduces changes that break a script, having a Docker container with the previous version allows quick validation and debugging.
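As a simple sketch, the same test script can be run against two pinned image tags; the script path is hypothetical, and the tags shown are examples of versions published to MCR.

# Validate a script against the current and a previous PowerShell release
docker run --rm -v "${PWD}:/workspace" mcr.microsoft.com/powershell:7.2-ubuntu-20.04 pwsh -File /workspace/scripts/test.ps1
docker run --rm -v "${PWD}:/workspace" mcr.microsoft.com/powershell:lts-ubuntu-18.04 pwsh -File /workspace/scripts/test.ps1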

Why Use PowerShell Instead of Other Shells?

PowerShell’s strength lies in how it handles data. Unlike shells like Bash that treat data as plain text strings, PowerShell treats data as structured objects. This fundamental difference leads to:

  • More powerful and readable scripts
  • Direct manipulation of data without relying on parsing text
  • Easier integration with APIs and SDKs

A great example of this is when working with cloud providers like AWS. The AWS SDK for PowerShell allows developers to perform API operations using native PowerShell syntax. The structured nature of PowerShell means that data can be passed directly to API calls as objects, simplifying development and reducing errors.
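For example, once the AWS.Tools.S3 module is installed and AWS credentials are configured (both assumptions here), bucket listings come back as objects whose properties can be filtered and piped without any text parsing.

# List S3 buckets as objects and filter on a property rather than parsing text
Import-Module AWS.Tools.S3
Get-S3Bucket |
    Where-Object { $_.CreationDate -lt (Get-Date).AddYears(-1) } |
    Select-Object BucketName, CreationDate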

Creating a Safe Testing Environment with Containers

Developers frequently need to test scripts and applications in controlled environments. Making changes to a development system can be risky and hard to roll back. Docker containers provide an ideal solution:

  • They isolate changes from the host system
  • Containers can be started and stopped as needed.
  • Different configurations can be created for different testing scenarios.

For PowerShell development, this means that you can:

  • Create containers with specific versions of PowerShell
  • Run scripts inside those containers.
  • Ensure behavior matches expected production environments.

Accessing Remote Containers in VS Code

The ability to access containers remotely in VS Code is facilitated by the Remote-Containers extension. This extension allows developers to:

  • Open folders inside containers
  • Use containers as development environments.
  • Attach to running containers for debugging and management.

Combining this with PowerShell gives you a powerful environment for both development and administration. For instance, a PowerShell script that configures a Linux server can be tested within a container mimicking that server’s environment, directly inside VS Code.

Prerequisites for Using Docker with PowerShell in VS Code

Before diving into implementation, a few prerequisites need to be satisfied:

  1. Docker Installation: Docker must be installed on the system. It acts as the container engine, managing the container lifecycle and networking.
    • On Linux: Use your distribution’s package manager (e.g., apt, yum, pacman)
    • On Windows: Download Docker Desktop from the Docker website.
  2. WSL (Windows Subsystem for Linux): On Windows, Docker often relies on WSL2. WSL2 provides a genuine Linux kernel running inside Windows, allowing Docker to function more efficiently.
  3. Visual Studio Code: Ensure you have the latest version of VS Code installed.
  4. Remote – Containers Extension: This extension bridges the gap between VS Code and Docker, enabling container-based development environments.
  5. PowerShell Docker Image: Microsoft provides official PowerShell images via the Microsoft Container Registry (MCR). These can be customized depending on the operating system base image needed.

Understanding the PowerShell Docker Image

Microsoft’s official PowerShell container images are hosted on the Microsoft Container Registry (MCR). You can find them under mcr.microsoft.com/powershell. The general command to pull the image is:

docker pull mcr.microsoft.com/powershell

This command pulls the latest stable version of PowerShell. However, there are several variations available:

  • mcr.microsoft.com/powershell:latest – latest stable version
  • mcr.microsoft.com/powershell:lts-ubuntu-18.04 – built on Ubuntu 18.04
  • mcr.microsoft.com/powershell:alpine – a lightweight Alpine Linux image

Using different images allows for more precise environment control. For instance, if your production systems use Ubuntu 18.04, it’s beneficial to develop and test scripts in an Ubuntu 18.04 container to ensure compatibility.

Getting Started with PowerShell in Docker Containers

Now that the background is covered, let’s begin the hands-on portion of the series. First, install Docker if you haven’t already. Once Docker is running:

Pull the PowerShell Docker Image

Open a terminal window and enter:

docker pull mcr.microsoft.com/powershell

You can replace this with a different tag for a specific version or OS base.

Run the PowerShell Container

To start a container with PowerShell interactively, use:

docker run -it mcr.microsoft.com/powershell pwsh

This command runs PowerShell in interactive mode within the container. You can begin running commands and testing scripts here.

Create a Container with Ubuntu 18.04

For example, to test scripts in an Ubuntu 18.04 environment:

docker pull mcr.microsoft.com/powershell:lts-ubuntu-18.04

Then run it:

docker run -it mcr.microsoft.com/powershell:lts-ubuntu-18.04 pwsh

Create a Dockerfile for Customization

If you want to build a custom container, create a Dockerfile like this:

FROM mcr.microsoft.com/powershell:lts-ubuntu-18.04

RUN apt-get update \
    && apt-get install -y git curl \
    && rm -rf /var/lib/apt/lists/*

ENTRYPOINT ["pwsh"]

Then build the image:

docker build -t custom-powershell .

And run it:

docker run -it custom-powershell

Verify PowerShell Version

Once inside the container, confirm which version you are using:

$PSVersionTable

This confirms you are in the expected environment.

Installing the Remote – Containers Extension

To begin, you will need to install the “Remote – Containers” extension for VS Code. This extension allows the editor to connect to Docker containers as if they were local development environments.

Steps to Install:

  1. Open VS Code
  2. Go to the Extensions view (Ctrl+Shift+X)
  3. Search for “Remote – Containers”
  4. Click “Install.”

Once installed, the extension adds new commands and options to the Command Palette (Ctrl+Shift+P) under the “Remote-Containers” namespace.

Cloning or Creating a Project

To work with containers in VS Code, you either need an existing project or a new one. Let’s walk through both cases.

Cloning an Existing Project

If your code repository already has Docker support:

  1. Open the Command Palette
  2. Select “Remote-Containers: Clone Repository in Container Volume”
  3. Enter the repository URL

VS Code will clone the repo and immediately open it inside a Docker container defined by the project’s configuration.

Creating a New Project

If you’re starting from scratch:

  1. Create a new folder for your project
  2. Add a Dockerfile based on PowerShell (as we did in Part 2)
  3. Create a .devcontainer directory.

Inside the .devcontainer directory, add a devcontainer.json file with the following content:

{
  "name": "PowerShell Dev Container",
  "build": {
    "dockerfile": "../Dockerfile"
  },
  "settings": {},
  "extensions": [
    "ms-vscode.PowerShell"
  ],
  "postCreateCommand": "pwsh -Command \"Install-Module -Name Az -Force -Scope CurrentUser\""
}

Explanation:

  • name: The display name of the container environment in VS Code
  • build: Points to the Dockerfile used for building the container
  • settings: Allows you to override VS Code settings within the container
  • extensions: VS Code extensions to install inside the container
  • postCreateCommand: Command to run after the container is created (useful for installing modules)

Opening the Folder in a Container

Once the configuration is in place:

  1. Open the Command Palette
  2. Select “Remote-Containers: Open Folder in Container”
  3. Choose the root project folder.

VS Code will now build the Docker container based on your Dockerfile and devcontainer.json configuration. It will then reopen the project inside the container, with terminal and integrated tools all running from within that environment.

Working with PowerShell Inside the Container

With the container open:

  • Open a terminal in VS Code (Ctrl+`)
  • The default shell will be PowerShell.
  • You can execute any PowerShell command or script as you would on a native machine.

This container environment is ideal for testing PowerShell scripts, modules, and automation workflows in a controlled, reproducible way.

Checking Installed Modules

Run the following command to list all available modules:

Get-Module -ListAvailable

This helps verify that your environment is correctly configured with the modules you need.
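You can also narrow the listing to the modules you expect to be present; the names below assume the Az module installed by the postCreateCommand shown earlier, plus PSReadLine, which ships with PowerShell.

# Confirm that specific modules were installed by the container setup
Get-Module -ListAvailable -Name Az*, PSReadLine | Select-Object Name, Version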

Installing Additional Tools

You can further modify the Dockerfile to install CLI tools, SDKs, or package managers. For example, adding AWS CLI support:

RUN curl "https://awscli.amazonaws.com/awscli-exe-linux-x86_64.zip" -o "awscliv2.zip" \
    && unzip awscliv2.zip \
    && ./aws/install

Once rebuilt, the container will have AWS CLI available to use with PowerShell scripts.

Debugging PowerShell Scripts

VS Code’s PowerShell extension enables integrated debugging. You can:

  • Set breakpoints
  • Step through the code
  • Watch variables
  • Evaluate expressions

All of this works seamlessly within the container. You can test script logic, explore error conditions, and fine-tune behavior before deployment.

Example Launch Configuration

Add a .vscode/launch.json file:

{
  "version": "0.2.0",
  "configurations": [
    {
      "type": "PowerShell",
      "request": "launch",
      "name": "PowerShell Launch Script",
      "script": "${workspaceFolder}/myscript.ps1",
      "args": []
    }
  ]
}

This enables launching and debugging myscript.ps1 directly from the Run panel.
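A trivial myscript.ps1 is enough to try the debugger; the file name matches the launch configuration above but its contents here are just a hypothetical example. Set a breakpoint on the loop body and step through it.

# myscript.ps1 - a small script for exercising the debugger
param([int]$Count = 3)

foreach ($i in 1..$Count) {
    $message = "Iteration $i of $Count"   # set a breakpoint here and inspect $message
    Write-Output $message
}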

Version Control and Shared Development

Using Docker and VS Code’s remote features ensures your entire team can:

  • Use the same development environment
  • Avoid system-specific bugs
  • Easily share and update environments.

Changes to the Dockerfile or devcontainer.json are tracked in version control, making collaboration simple.

Attaching to a Running Container in VS Code

There may be situations where you need to work with a container that is already running, perhaps one started by another developer or by a CI/CD tool. The Remote – Containers extension provides the ability to attach to any running container and open it in VS Code.

Steps to Attach to a Container:

  1. Ensure the container is running. You can verify this by running:

docker ps

  2. Open the Command Palette in VS Code.
  3. Select the command: “Remote-Containers: Attach to Running Container…”
  4. Choose the container from the list presented.

VS Code will then open the container, mount it as the workspace, and provide access to the terminal, filesystem, and debugging tools inside the container. This is particularly useful when dealing with services that are long-running or initiated outside of your local environment.

Using Docker Compose for Multi-Service Development

Sometimes, one container is not enough. For more complex applications, you may require additional services, such as a database, caching layer, or backend API, alongside your PowerShell scripts. Docker Compose provides a powerful solution for managing multi-container applications.

Basic Docker Compose File Structure:

Create a docker-compose.yml file in the root of your project:

version: '3.8'

services:
  powershell-dev:
    build:
      context: .
      dockerfile: Dockerfile
    volumes:
      - .:/workspace
    command: pwsh

  mongo:
    image: mongo:latest
    ports:
      - "27017:27017"

  redis:
    image: redis:alpine
    ports:
      - "6379:6379"

This configuration spins up a PowerShell development environment along with MongoDB and Redis services. These can be used by your PowerShell scripts to simulate production environments.
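A quick way to confirm the supporting services are reachable from inside the PowerShell container is a raw TCP check; the host names below match the Compose service names defined above.

# Verify the Compose services are reachable by their service names
foreach ($svc in @(@{Host = 'mongo'; Port = 27017}, @{Host = 'redis'; Port = 6379})) {
    $client = [System.Net.Sockets.TcpClient]::new()
    try {
        $client.Connect($svc.Host, $svc.Port)
        Write-Output "$($svc.Host):$($svc.Port) is reachable"
    } catch {
        Write-Warning "$($svc.Host):$($svc.Port) is not reachable"
    } finally {
        $client.Dispose()
    }
}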

Updating devcontainer.json for Compose:

In the .devcontainer/devcontainer.json file, update the configuration to use Docker Compose:

{
  "name": "PowerShell Multi-Service Dev",
  "dockerComposeFile": [
    "../docker-compose.yml"
  ],
  "service": "powershell-dev",
  "workspaceFolder": "/workspace",
  "extensions": [
    "ms-vscode.PowerShell"
  ]
}

Now, when you reopen the folder in a container, VS Code will build and launch all services defined in the Compose file and attach you to the powershell-dev service.

Creating Custom PowerShell Modules in Containers

One key advantage of using containerized environments is the ease with which you can create, test, and distribute PowerShell modules.

Example: Creating a Module

  1. Inside your container, create a new folder for your module:

mkdir MyModule
cd MyModule

  2. Create the module manifest:

New-ModuleManifest -Path "./MyModule.psd1" -RootModule "MyModule.psm1"

  3. Add your PowerShell functions to MyModule.psm1:

function Get-Greeting {
    param([string]$Name)
    "Hello, $Name!"
}

  4. Import and test the module:

Import-Module ./MyModule.psd1
Get-Greeting -Name "ExamLabs"

You can include this module in your version control repository and use Docker to test it against multiple PowerShell versions.
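A rough loop for doing that from the host might look like the following; it mounts the module folder into containers built from two example tags and runs the same smoke test in each. The tags are examples, not a fixed requirement.

# Smoke-test the module under two PowerShell versions (tags are examples)
foreach ($tag in '7.2-ubuntu-20.04', 'lts-ubuntu-18.04') {
    docker run --rm -v "${PWD}/MyModule:/module" "mcr.microsoft.com/powershell:$tag" `
        pwsh -Command "Import-Module /module/MyModule.psd1; Get-Greeting -Name 'ExamLabs'"
}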

Integrating PowerShell Containers into CI/CD Pipelines

Containers excel in CI/CD environments because they ensure consistency between development, testing, and production systems. Let’s look at how you can integrate PowerShell-based containers into a CI/CD workflow.

Using GitHub Actions

GitHub Actions allows you to define workflows that build and test code automatically. Here’s an example of a workflow that builds a Docker image and runs a PowerShell script inside it:

name: CI Pipeline

on:
  push:
    branches:
      - main

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - name: Build Docker Image
        run: |
          docker build -t powershell-dev .
      - name: Run PowerShell Script
        run: |
          docker run powershell-dev pwsh -File ./scripts/test.ps1

This workflow automatically builds your container and runs your test script every time code is pushed to the main branch.

Using Azure DevOps

In Azure DevOps, you can define a pipeline using a YAML file. Here’s how to integrate PowerShell containers:

trigger:
  - main

pool:
  vmImage: 'ubuntu-latest'

steps:
  - checkout: self
  - task: Docker@2
    inputs:
      command: 'build'
      Dockerfile: '**/Dockerfile'
      tags: 'latest'
  - script: |
      docker run powershell-dev pwsh -Command "./scripts/test.ps1"
    displayName: 'Run PowerShell Script'

This approach ensures your testing and deployment environments remain consistent, minimizing bugs caused by environment discrepancies.

Managing Environment Variables and Secrets

Handling environment variables and secrets is critical in production-level container workflows. You can pass variables at runtime or define them in Docker Compose files.

Passing Environment Variables:

In Docker CLI:

docker run -e "API_KEY=yourapikey" powershell-dev

In Docker Compose:

services:
  powershell-dev:
    environment:
      - API_KEY=yourapikey

Inside PowerShell, you can access these variables using:

$env:API_KEY

For managing secrets, consider using Docker secrets or integrating with external vault systems like HashiCorp Vault or Azure Key Vault.
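As one hedged example, if the Az modules are installed (as in the earlier postCreateCommand) and you have signed in, a secret can be pulled from Azure Key Vault at runtime instead of being baked into the image; the vault and secret names below are placeholders.

# Retrieve a secret from Azure Key Vault at runtime (vault/secret names are examples)
Connect-AzAccount -UseDeviceAuthentication
$apiKey = Get-AzKeyVaultSecret -VaultName 'examlabs-vault' -Name 'ApiKey' -AsPlainText
$env:API_KEY = $apiKey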

Using Containers for Version Compatibility Testing

One often overlooked benefit of containers is the ability to test your PowerShell code across multiple versions with minimal setup.

Testing Different PowerShell Versions:

You can create multiple Dockerfiles or use tags from the official PowerShell image repository on MCR:

docker run mcr.microsoft.com/powershell:7.2-ubuntu-20.04 pwsh -Command "./scripts/test.ps1"
docker run mcr.microsoft.com/powershell:7.0-alpine pwsh -Command "./scripts/test.ps1"

This technique helps ensure your scripts work as expected across various versions and platforms, a crucial step when distributing modules publicly or supporting legacy systems.

Monitoring and Logging in Containers

When working with containers, especially in production or shared environments, logging and monitoring become essential.

Logging PowerShell Output

You can redirect PowerShell output to a file:

pwsh -Command "./scripts/test.ps1 *>&1 | Out-File /logs/test.log"

You can also use centralized logging solutions like Fluentd, Logstash, or send logs to cloud platforms for real-time analysis.
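Within the container itself, PowerShell’s built-in transcription is another lightweight option; the sketch below assumes a /logs directory or volume exists and that ./scripts/test.ps1 is the script you want to capture.

# Capture everything from a session or script run into a transcript file
Start-Transcript -Path /logs/session.log -Append
./scripts/test.ps1
Stop-Transcript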
