<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:media="http://search.yahoo.com/mrss/"><channel><title><![CDATA[vNext Engineer by Daniel Gut]]></title><description><![CDATA[how modern system engineers embrace devops]]></description><link>https://vnextengineer.azurewebsites.net/</link><generator>Ghost 0.7</generator><lastBuildDate>Fri, 10 Apr 2026 01:13:43 GMT</lastBuildDate><atom:link href="https://vnextengineer.azurewebsites.net/rss/" rel="self" type="application/rss+xml"/><ttl>60</ttl><item><title><![CDATA[PowerShell - Continuous Integration in Visual Studio Team Services]]></title><description><![CDATA[<p>Testing PowerShell scripts has become very popular recently. <a href="https://github.com/pester/Pester">Pester</a> is probably the most widely used testing framework for PowerShell. If you combine automated testing with a version control system like Git, you get a very powerful development environment. For example, if you change a script and check it into Git</p>]]></description><link>https://vnextengineer.azurewebsites.net/powershell-continuous-integration/</link><guid isPermaLink="false">8ba2f8c5-e814-41f3-a4e6-03175d512154</guid><category><![CDATA[PowerShell]]></category><category><![CDATA[TFS]]></category><category><![CDATA[VSTS]]></category><category><![CDATA[Pester]]></category><dc:creator><![CDATA[Daniel Gut]]></dc:creator><pubDate>Thu, 07 Jan 2016 13:52:35 GMT</pubDate><media:content url="https://vnextengineer.azurewebsites.net/content/images/2016/01/results-1.png" medium="image"/><content:encoded><![CDATA[<img src="https://vnextengineer.azurewebsites.net/content/images/2016/01/results-1.png" alt="PowerShell - Continuous Integration in Visual Studio Team Services"><p>Testing PowerShell scripts has become very popular recently. 
<a href="https://github.com/pester/Pester">Pester</a> is probably the most widely used testing framework for PowerShell. If you combine automated testing with a version control system like Git, you get a very powerful development environment. For example, if you change a script and check it into Git, automated tests run and report whether your changes break anything. This workflow is called continuous integration and is fundamental to the DevOps workflow in traditional software development.</p>

<p>This tutorial will guide you through a simple automated testing scenario. The version control system we use for this demo is Git on <a href="https://www.visualstudio.com/products/visual-studio-team-services-vs">Visual Studio Team Services</a> or an on-premises <a href="https://www.visualstudio.com/products/tfs-overview-vs">Team Foundation Server 2015</a> installation.</p>

<p>To kick-start this tutorial I've prepared a very simple script with two Pester tests. I won't go into detail here about how Pester works or how to use Git.</p>

<p>You can download the sample here: <a href="https://github.com/OneCyrus/vNextEngineer/raw/master/PowerShell/CIDemo/PowerShellCIDemo.zip">PowerShell CI Demo Scripts</a></p>
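
<p>To give an idea of what such a script/test pair looks like, here is a minimal sketch (hypothetical file names and contents, not necessarily identical to the download):</p>

<pre><code class="language-powershell"># Get-Greeting.ps1 - a trivial function to test
function Get-Greeting {
    param([string]$Name)
    "Hello, $Name!"
}

# Get-Greeting.Tests.ps1 - the matching Pester tests
# (dot-source the script under test first)
. "$PSScriptRoot\Get-Greeting.ps1"

Describe "Get-Greeting" {
    It "greets by name" {
        Get-Greeting -Name "World" | Should Be "Hello, World!"
    }
    It "includes the given name" {
        Get-Greeting -Name "Pester" | Should Match "Pester"
    }
}
</code></pre>

<p>Pester discovers files named <code>*.Tests.ps1</code> automatically, which is why the naming convention matters.</p>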

<h4 id="projectstructure">Project structure</h4>

<p>The structure of the demo project is divided into the following folders:</p>

<ul>
<li>PSExample
<ul><li><strong>Pester</strong> (Folder with the original Pester files)</li>
<li><strong>PSExample</strong> (Folder with the PowerShell script and the Pester tests for this script)</li>
<li><strong>RunPester.ps1</strong> (this executes the Pester tests)</li></ul></li>
</ul>

<p>You can see that the Pester framework and the actual PowerShell script are separated into different folders. This ensures the Pester tests run only on the actual script and don't execute the tests for Pester itself. If you have a lot of PowerShell scripts, you probably want to separate your test scripts into yet another folder.</p>

<p>RunPester.ps1 only executes the Pester tests with the correct parameters. The two important parameters are <code>OutputFormat</code>, which specifies the format of the test results, and <code>OutputFile</code>, which tells Pester where to save them.</p>

<pre><code class="language-powershell"># Load the bundled Pester module and run all tests in the PSExample folder,
# writing NUnit-formatted results next to this script
Import-Module "$PSScriptRoot\Pester\Pester.psm1"
Invoke-Pester -CodeCoverage *.ps1 -Path "$PSScriptRoot\PSExample" -OutputFormat NUnitXml -OutputFile "$PSScriptRoot\TestResult.xml"
</code></pre>
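
<p>As written, the build step will succeed even if tests fail; the failures only show up in the published results. If you also want the PowerShell step itself to fail the build, one possible variant uses the <code>-PassThru</code> switch to inspect the result object (a sketch, assuming Pester 3's <code>-PassThru</code> support):</p>

<pre><code class="language-powershell"># Run the tests and keep the result object
Import-Module "$PSScriptRoot\Pester\Pester.psm1"
$result = Invoke-Pester -Path "$PSScriptRoot\PSExample" -OutputFormat NUnitXml -OutputFile "$PSScriptRoot\TestResult.xml" -PassThru

# A non-zero exit code marks the build step as failed
if ($result.FailedCount -gt 0) {
    exit 1
}
</code></pre>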

<h4 id="builddefinition">Build Definition</h4>

<p>Now that we have a working demo project with tests, we need to create a build definition. Build definitions are usually used for compiling source code into executables, but we can also use them to test our PowerShell scripts.</p>

<ol>
<li><p>Create a new <strong>empty</strong> build definition and click next. <img src="https://vnextengineer.azurewebsites.net/content/images/2016/01/NewBuildDefinition.png" alt="PowerShell - Continuous Integration in Visual Studio Team Services" title=""></p></li>
<li><p>On the second page just enable the checkbox "Continuous integration". <img src="https://vnextengineer.azurewebsites.net/content/images/2016/01/CreateBuildDef.png" alt="PowerShell - Continuous Integration in Visual Studio Team Services" title=""></p></li>
<li><p>After the wizard finishes we need to add two steps to our <br>
build definition. First the <strong>PowerShell</strong> task. <br>
<img src="https://vnextengineer.azurewebsites.net/content/images/2016/01/PowerShellExecute.png" alt="PowerShell - Continuous Integration in Visual Studio Team Services" title=""> and then a <strong>Publish Test Results</strong><img src="https://vnextengineer.azurewebsites.net/content/images/2016/01/PublishTests.png" alt="PowerShell - Continuous Integration in Visual Studio Team Services" title=""></p></li>
<li><p>We need to configure the PowerShell task. The important thing here is to select the <strong>RunPester.ps1</strong>. This will execute the Pester tests while building. <img src="https://vnextengineer.azurewebsites.net/content/images/2016/01/PowerShellConfig.png" alt="PowerShell - Continuous Integration in Visual Studio Team Services" title=""></p></li>
<li><p>For the second task we need to set the test result format to <strong>NUnit</strong> (the format we specified in RunPester.ps1) and the <strong>Test Results Files</strong> field to the OutputFile from above. <br>
<img src="https://vnextengineer.azurewebsites.net/content/images/2016/01/publishtests2.PNG" alt="PowerShell - Continuous Integration in Visual Studio Team Services"></p></li>
<li><p>Now we can click <strong>Save</strong>. To start a build manually, click <strong>Queue build...</strong>. This will show the execution of the workflow in a console view. <br>
<img src="https://vnextengineer.azurewebsites.net/content/images/2016/01/running.png" alt="PowerShell - Continuous Integration in Visual Studio Team Services" title=""> When the build is complete we can view the full test results like this (the demo script has an intentional geolocation error):
<img src="https://vnextengineer.azurewebsites.net/content/images/2016/01/results.png" alt="PowerShell - Continuous Integration in Visual Studio Team Services"></p></li>
</ol>

<p>Now every time you check a change into this Git repository the tests will be executed. You can even have it send you an email when there's an error.</p>

<p>This concludes this tutorial. It should show you the basics of integration tests with PowerShell. There are a lot of advanced options you can explore to customize the experience for your own workflow.</p>]]></content:encoded></item><item><title><![CDATA[PowerShell Monitoring (Application Insights)]]></title><description><![CDATA[<p>Have you ever heard of <a href="https://azure.microsoft.com/en-us/services/application-insights/">Application Insights</a>? It's basically a monitoring service for developers. You can integrate it into your own applications and the results are some very nice charts. You can even build your own dashboard. There are quite a lot of different SDKs for the most popular languages.</p>]]></description><link>https://vnextengineer.azurewebsites.net/powershell-application-insights/</link><guid isPermaLink="false">f11a2926-8d01-40cc-a2e6-5c28eca86667</guid><category><![CDATA[Monitoring]]></category><category><![CDATA[DevOps]]></category><category><![CDATA[Azure]]></category><category><![CDATA[PowerShell]]></category><category><![CDATA[ApplicationInsights]]></category><dc:creator><![CDATA[Daniel Gut]]></dc:creator><pubDate>Sun, 03 Jan 2016 12:45:00 GMT</pubDate><content:encoded><![CDATA[<p>Have you ever heard of <a href="https://azure.microsoft.com/en-us/services/application-insights/">Application Insights</a>? It's basically a monitoring service for developers. You can integrate it into your own applications and the results are some very nice charts. You can even build your own dashboard. There are quite a lot of different SDKs for the most popular languages. Wouldn't it be nice to have your scheduled PowerShell scripts monitored as well? Sadly there's currently no specific SDK for PowerShell. But it's actually quite easy to use the .NET SDK in PowerShell.</p>

<p><img src="https://vnextengineer.azurewebsites.net/content/images/2016/01/AIChart.PNG" alt=""></p>

<h4 id="createapplicationinsightsresourceinazure">Create Application Insights resource in Azure</h4>

<p>Go to <a href="https://portal.azure.com">https://portal.azure.com</a> and create a new Application Insights resource. There is almost no configuration required.</p>

<p><img src="https://vnextengineer.azurewebsites.net/content/images/2016/01/AIcreation.PNG" alt=""></p>

<p>When the AI resource is created you'll get the following information. The most important part is the instrumentation key. This is the key which identifies the resource, and PowerShell will need it to send the tracking information.</p>

<p><img src="https://vnextengineer.azurewebsites.net/content/images/2016/01/AIKey2.png" alt=""></p>

<h4 id="powershellintegration">PowerShell integration</h4>

<p>Now that everything is set up to collect the information, we need to actually generate some data. First we need to load the corresponding assembly from the NuGet package. For this you need the .nupkg from the following link: <a href="https://www.nuget.org/api/v2/package/Microsoft.ApplicationInsights/1.2.3">https://www.nuget.org/api/v2/package/Microsoft.ApplicationInsights/1.2.3</a></p>

<p>A .nupkg is just a renamed zip file, so we can rename it to <strong>microsoft.applicationinsights.1.2.3.zip</strong> and extract the contents. There are a couple of different assemblies for different platforms. In this example we're going to use <strong>\lib\net45\Microsoft.ApplicationInsights.dll</strong>.</p>
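
<p>If you prefer to stay in PowerShell, downloading and unpacking can be scripted too (a sketch; <code>Expand-Archive</code> assumes PowerShell 5 or later):</p>

<pre><code class="language-powershell"># Download the NuGet package as a .zip and extract it
$url = "https://www.nuget.org/api/v2/package/Microsoft.ApplicationInsights/1.2.3"
$zip = "$PSScriptRoot\microsoft.applicationinsights.1.2.3.zip"
Invoke-WebRequest -Uri $url -OutFile $zip
Expand-Archive -Path $zip -DestinationPath "$PSScriptRoot\ai"
# The assembly we need: $PSScriptRoot\ai\lib\net45\Microsoft.ApplicationInsights.dll
</code></pre>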

<p>With the following code you can initialize a <code>TelemetryClient</code> which will interact with the Application Insights backend. To send the data to the correct AI resource you need to set the instrumentation key (the one we got above when creating the resource).</p>

<pre><code class="language-powershell"># Load the Application Insights .NET SDK assembly
$AI = "$PSScriptRoot\Microsoft.ApplicationInsights.dll"
[Reflection.Assembly]::LoadFile($AI)

# Create a TelemetryClient and point it at our AI resource
$InstrumentationKey = "d6a7f123-4567-4630-9da0-54f82d4a39c8"
$TelClient = New-Object "Microsoft.ApplicationInsights.TelemetryClient"
$TelClient.InstrumentationKey = $InstrumentationKey
</code></pre>

<p>There are a couple of ways to send information to the backend. The most basic one is a generic event via <code>TrackEvent(String)</code>, which lets you send any string. After tracking the event we call <code>Flush()</code>. This forces the TelemetryClient to send the data to the backend immediately. Otherwise the client collects data for approximately one minute and then sends it as a bulk job. If the PowerShell process exits before the data is sent, you won't see anything in the AI portal. So use <code>Flush()</code> to be sure the data actually gets sent.</p>

<pre><code class="language-powershell"># Event
$TelClient.TrackEvent("PowerShell rocks!")
$TelClient.Flush()
</code></pre>

<p>For advanced data collection we can use <code>TrackMetric(MetricTelemetry)</code>. This lets you track a value for a specific metric/counter, which makes it possible to display charts showing how your script performed over a period of time.</p>

<pre><code class="language-powershell"># Metric
$TrackMetric = New-Object "Microsoft.ApplicationInsights.DataContracts.MetricTelemetry"
$TrackMetric.Name = "Powershell"
$TrackMetric.Value = 12
$TelClient.TrackMetric($TrackMetric)
$TelClient.Flush()
</code></pre>
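
<p>A typical use of metrics is timing parts of a script. For example, one could track a script's runtime like this (an illustrative sketch reusing the <code>$TelClient</code> from above; the metric name is made up):</p>

<pre><code class="language-powershell"># Measure how long some work takes and report it as a metric
$sw = [System.Diagnostics.Stopwatch]::StartNew()
Start-Sleep -Milliseconds 200   # stand-in for real work
$sw.Stop()

$DurationMetric = New-Object "Microsoft.ApplicationInsights.DataContracts.MetricTelemetry"
$DurationMetric.Name = "ScriptDurationMs"
$DurationMetric.Value = $sw.Elapsed.TotalMilliseconds
$TelClient.TrackMetric($DurationMetric)
$TelClient.Flush()
</code></pre>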

<p>And one of my favorite features is the tracking of exceptions. Isn't that what we're most interested in? It's not that different from the other tracking methods. You can use <code>TrackException()</code>, which takes an <code>ExceptionTelemetry</code> object as its argument.</p>

<p>The following snippet forces a DivideByZeroException which is caught and reported to AI.</p>

<pre><code class="language-powershell"># Exception
try {  
    8/0 #DivideByZeroException
}
catch {  
    $TelException = New-Object "Microsoft.ApplicationInsights.DataContracts.ExceptionTelemetry"
    $TelException.Exception = $_.Exception
    $TelClient.TrackException($TelException)
    $TelClient.Flush()
}
</code></pre>

<p><br>  </p>

<h4 id="visualizethedata">Visualize the data</h4>

<p>If you don't want to look at the data in the Azure portal, there's also a way to see it in <a href="https://www.visualstudio.com/products/visual-studio-community-vs">Visual Studio 2015 Community (free)</a>.</p>

<p>Just go to: View > Other Windows > Application Insights Search</p>

<p>It looks like this: <br>
<img src="https://vnextengineer.azurewebsites.net/content/images/2016/01/VSAIUI.PNG" alt=""></p>

<p><br> <br>
There are a lot of ways you can use this data. In the future I'll do some more blog posts about advanced scenarios. Check back!</p>]]></content:encoded></item><item><title><![CDATA[About]]></title><description><![CDATA[<p>Hi, my name is Daniel Gut and I'm based in Switzerland. Currently I'm working as an ICT architect at <a href="http://www.gia.ch">GIA Informatik AG</a> - a Swiss IT service provider. My focus is on management and automation of data center infrastructure and enterprise clients. </p>

<p>While working over 15 years in this business</p>]]></description><link>https://vnextengineer.azurewebsites.net/about/</link><guid isPermaLink="false">f8e69c51-e2ec-479c-adde-3cbd95aa7603</guid><dc:creator><![CDATA[Daniel Gut]]></dc:creator><pubDate>Sun, 03 Jan 2016 12:19:32 GMT</pubDate><content:encoded><![CDATA[<p>Hi, my name is Daniel Gut and I'm based in Switzerland. Currently I'm working as an ICT architect at <a href="http://www.gia.ch">GIA Informatik AG</a> - a Swiss IT service provider. My focus is on management and automation of data center infrastructure and enterprise clients. </p>

<p>Over 15 years of working in this business I have acquired valuable knowledge and experience. The intersection of software development and system engineering in the automation business is especially interesting, as it allowed me to experience two different points of view. Over the last couple of years the requirements of both fields have changed quite a bit. IT is no longer just a supporting tool - it is the foundation of a growing market. Digital transformation enables new business models which couldn't exist without today's infrastructure. This dependency requires a completely new level of reliable and professional systems.</p>

<p>To meet these requirements, workflows and tools need to adapt too. That's where the idea for this blog came from. What should a next-generation system engineer look like? What tools should they use? And what is their role in the future? <br>
I think there are some very interesting things happening in the developer space which most system engineers overlook. I'm trying to adapt these concepts and make them accessible to system engineers. Not everything works for everyone. I'm not here to sell you ideas - I'm here to inspire you.</p>

<p>You can always find me on Twitter: <a href="http://www.twitter.com/danielgut">@danielgut</a></p>]]></content:encoded></item></channel></rss>