Creating Hyper-V VM with Ubuntu Server

I’ve recently decided to learn a bit about Ubuntu and am going to do some projects based on this platform, hence this little post describing how to create an Ubuntu Server Hyper-V VM.

First of all, you need to download the latest Ubuntu Server installation media from here, choosing between the LTS (Long Term Support) and regular versions:

The LTS version is a more thoroughly tested, enterprise-focused release which comes out every 2 years and has a 5-year support cycle.

Once you have the installation media you just need to create a Hyper-V VM, allocating the desired amount of resources to it (note that this OS has quite humble minimum requirements), and make your VM generation choice.

Despite the fact that the process of creating a VM is more or less the same for any OS, I decided to write down all the steps involved in setting up an Ubuntu Server VM.

You can follow these steps to create your own Hyper-V VM with Ubuntu Server OS. Right-click on your Hyper-V host and select New > Virtual Machine:

Just click Next on the Before You Begin page:

Specify a name and location for your VM (be sure to specify your preferred VMs folder; a VM-specific subfolder will be created automatically based on the VM name you type in):

Select the generation of your VM (note that this cannot be changed once the VM is created):

I wanted a Generation 2 VM so I selected this option (refer to the MSFT documentation for guidance on choosing a VM generation). Note that for an Ubuntu VM you need to disable the Secure Boot feature, which is enabled by default on Generation 2 VMs.

Assign the desired amount of memory and decide whether Dynamic Memory should be used:

Select a virtual switch:

Adjust VHD settings if necessary:

Specify the path to the Ubuntu ISO file you downloaded earlier:

Review the selections you made and click Finish:

Disable Secure Boot before powering on your VM – otherwise your VM will fail to boot (as per the MSFT documentation: “some Linux virtual machines will not boot unless the secure boot option is disabled”):

And while you are still in the VM properties, I would recommend disabling automatic checkpoints (unless you want to use them):
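For reference, the whole sequence above can also be scripted with the Hyper-V PowerShell module. This is just a sketch – the VM name, paths, switch name and sizes below are example values, not the ones used in this walkthrough:

```powershell
# Create a Generation 2 VM with a new 40 GB VHDX (names/paths are examples)
New-VM -Name "UbuntuServer" -Generation 2 -MemoryStartupBytes 2GB `
    -NewVHDPath "D:\Hyper-V\UbuntuServer\UbuntuServer.vhdx" -NewVHDSizeBytes 40GB `
    -Path "D:\Hyper-V" -SwitchName "External Switch"

# Attach the Ubuntu Server installation ISO
Add-VMDvdDrive -VMName "UbuntuServer" -Path "D:\ISO\ubuntu-server.iso"

# Disable Secure Boot - required for Ubuntu on a Generation 2 VM
Set-VMFirmware -VMName "UbuntuServer" -EnableSecureBoot Off

# Disable automatic checkpoints (parameter available on recent Hyper-V versions)
Set-VM -Name "UbuntuServer" -AutomaticCheckpointsEnabled $false
```

Requires the Hyper-V PowerShell module on the host; run from an elevated session.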

Once you start the VM, the setup process will be initiated automatically:

You will need to select preferred language:

Then keyboard settings:

And next, select the Install Ubuntu option:

Accept the default network connection settings:

And leave the proxy settings empty (unless you are using a proxy server):

Accept the default archive mirror address and hit Done:

Accept the defaults on the filesystem setup page (which means using the entire disk for the installation):

Select a disk, or accept the selection if you have just one:

Accept default filesystem settings on the next page:

Agree to formatting the selected drive (data loss warning):

Specify the profile settings and server name (note that only lowercase letters are accepted for server and user names – a great example of explicitness which leaves you no chance to grow into a proficient user believing that some case-sensitive objects are case-insensitive – something that happens way too often thanks to some user-friendly OSs):

Select whether you want to install OpenSSH server:

Select any additional packages you may want to install:

Wait while the installation goes through the remaining steps:

Hit Reboot Now once installation completes:

Once the VM reboot completes you will be prompted to remove the installation medium and hit ENTER (Hyper-V should remove it automatically for you):

Once the reboot completes, Ubuntu Server should start and greet you with a credentials prompt:

Once you type in your login and password correctly you will be able to enter commands (no GUI is installed on the Server version by default):

At this point I suggest you shut down the VM with the shutdown -P now command and take your baseline VM snapshot.
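If you prefer to take that baseline snapshot from the Hyper-V side with PowerShell, a one-liner like this should do it (the VM and checkpoint names are examples):

```powershell
# Create a checkpoint of the powered-off VM as a clean baseline
Checkpoint-VM -Name "UbuntuServer" -SnapshotName "Baseline - fresh install"
```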

Let’s do a couple more things before we wrap up our VM setup process. First, install updates using sudo apt-get update (to fetch the list of available updates) and sudo apt-get upgrade (to upgrade installed packages):
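Put together, the update step looks like this in the VM console:

```shell
sudo apt-get update     # refresh the list of available packages
sudo apt-get upgrade    # upgrade all installed packages to the latest versions
```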

And last but not least, let’s add a GUI to our server – for that just use sudo apt-get install ubuntu-desktop, confirming that you want to continue at the additional disk space consent step. Once setup completes you need to reboot your VM and it will start in GUI mode:

After clicking on your user icon, type in your password and click Sign In:

You will be presented with the What’s new in Ubuntu splash screen:

This concludes the VM installation and configuration process. Stay tuned for new posts, as I’m going to keep using this VM and documenting the installation and configuration of additional packages and other things.

Sample PS script for bulk creation of AD DS groups

You know, sometimes the need to create 10 groups for a quick test in ADUC is enough to fire up Windows PowerShell ISE and compose a PS script… Below you can find a little script to create any number of AD DS groups you want; thanks to its compactness it may also serve as an example of implementing a WHILE loop in PowerShell, so I’ll just leave it here.
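The original script isn’t reproduced in this copy of the post, but a minimal version of what it describes might look like this – the group name prefix, target OU and count are assumptions you would adjust to your environment:

```powershell
# Bulk-create test AD DS groups using a WHILE loop
# Requires the ActiveDirectory module (RSAT) and rights to create groups in the OU
Import-Module ActiveDirectory

$prefix = "TestGroup"                       # example name prefix
$ou     = "OU=TestGroups,DC=contoso,DC=com" # example target OU
$count  = 10                                # how many groups to create
$i = 1

while ($i -le $count) {
    New-ADGroup -Name "$prefix$i" -GroupScope Global -GroupCategory Security -Path $ou
    $i++
}
```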

Unable to add/remove K2 Environment Fields – “You are not authorized to perform the requested operation”

In certain scenarios (for example, when you changed your K2 administrative accounts) you may see the following error when trying to add or remove Environment Field in Environment Library:

You are not authorized to perform the requested operation

This may happen even for a user which has been assigned the K2 Administrator role in Setup Manager, when custom security was configured on the Environment Library and it didn’t include this specific account.

To resolve this (provided you have an account with administrative rights), just look into the Security settings available under the list of variables themselves when you navigate to Environment Library > %Environment Library Name%:

Environment Variable Security Settings

Just add the required user, assigning them Modify rights, to resolve this issue.

K2 Mobile Applications – Updated landing page

It used to be somewhat confusing having two mobile apps (K2 Workspace and K2 Mobile) for two platforms (iOS and Android), but the recently updated K2 Mobile Applications help landing page makes things clear right off the bat, making it easy for you to navigate to the right information:

K2 Mobile Applications Documentation Landing Page – App Version and Platform selection

There are also a couple of useful links at the bottom of the new landing page, namely Distributing K2 Mobile Application with MDM and K2 Mobile Support Policy:


K2 Mobile Applications Documentation Landing Page – Additional Resources

Really good job by the K2 documentation team 🙂 I can really see that the product documentation is becoming better and easier to use.

Windows Server 2016: CDPUserSvc has stopped working

You may observe the following error on Windows Server 2016 immediately after OS startup:

CDPUserSvc_65df7 has stopped working

This “has stopped working” part tells us that some unhandled exception occurred, so we can switch over to Event Viewer to find more details about it:

svchost.exe_CDPUserSvc_65df7 – Exception code: 0xc0000005

Exception details are the following:

Faulting application name: svchost.exe_CDPUserSvc_65df7, version: 10.0.14393.0, time stamp: 0x57899b1c
Faulting module name: cdp.dll, version: 10.0.14393.1715, time stamp: 0x59b0d38c
Exception code: 0xc0000005
Fault offset: 0x0000000000193cf5
Faulting process id: 0x1b14

After some quick research I found out that this error was introduced by certain Microsoft updates, and to resolve it on Windows Server 2016 14393.1884 you just need to apply another update 🙂 More specifically, you need to install KB4053579, which can be downloaded from the Microsoft Update Catalog. Applying this update resolves the error.

Unable to connect to named SQL instance on remote machine

Recently I was doing an installation of K2 5.2 on Azure VMs, with a SQL Server named instance hosted on a separate Azure VM. I created a SQL Server alias on the K2 VM but then ran into an issue – neither K2 Setup Manager nor SSMS was able to connect to SQL through the alias. I next tried a direct connection via server\instance name, which also failed. SSMS showed me the following error:

Could not open a connection to SQL Server. Microsoft SQL Server, Error: 53

I first focused on network connectivity between the VMs:

  • Confirmed that I could ping the SQL Server VM from the K2 Server VM
  • Confirmed that no firewall was enabled on the VMs and that the Azure VMs were on the same network with nothing blocking connectivity between them
  • Tried to use telnet to test port 1433 – it failed
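Those checks map to a few commands at a Windows command prompt (SQLVM01 is a placeholder for your SQL Server VM name):

```shell
rem Basic reachability from the K2 VM
ping SQLVM01
rem TCP connectivity to the default SQL port (requires the Telnet client feature)
telnet SQLVM01 1433
rem Run on the SQL Server VM itself: is anything listening on 1433?
netstat -na | find "1433"
```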

This is what kept me focused on the network connectivity layer a bit longer than necessary. But after confirming with netstat -na | find "1433" that SQL Server was not listening on port 1433, it became quite clear that the focus should be on the SQL Server configuration. First of all, by default a named instance listens on a dynamic port, and you need the SQL Server Browser service enabled to be able to connect to a named instance without specifying a port while dynamic ports are in use. But in my case it was not that, as the SQL Server configuration had an explicitly specified custom port (SQL Server Configuration Manager > Protocols for %INSTANCE_NAME% > TCP/IP Properties > TCP Dynamic Ports – if you have anything other than 0 in the IPAll section for this setting, you are not using dynamic ports). When your problem is dynamic ports and a disabled SQL Server Browser service, the error message from SSMS looks as follows:

SQL Network Interfaces, error: 26 – Error Locating Server/Instance Specified

As you can see, the error message explicitly tells you “Error Locating Server/Instance Specified”. To fix this, either set 0 for the TCP Dynamic Ports setting and enable the SQL Server Browser service, or specify some port number there. You are sort of choosing your dependency here – either the Browser service (which may fail to start) or a custom port (which may be hijacked by another service). The Browser service seems to be the better approach.

So in my case I was confused by expecting the named instance to listen on the default port – which was, to put it simply, a wrong expectation. Here is how you can check which port your instance is listening on:

Commands which you can use to find out your SQL Server instance ports
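One option (once you can connect at all, e.g. locally via SSMS) is to ask the instance itself; another is to search the error log, where the instance reports its listening ports at startup:

```sql
-- Port used by the current connection (NULL for shared memory / named pipes)
SELECT local_tcp_port FROM sys.dm_exec_connections WHERE session_id = @@SPID;

-- Search the current SQL Server error log for the ports announced at startup
EXEC xp_readerrorlog 0, 1, N'Server is listening on';
```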

But obviously, having access to the SQL Server you can get this data from SQL Server Configuration Manager too: SQL Server Configuration Manager > Protocols for %INSTANCE_NAME% > TCP/IP Properties. Just keep in mind that you need to check the TCP Dynamic Ports value both for the specific address and for the IPAll section. But like I said, in my case the problem was not about ports. Once I found out the instance port I noticed that I still could not connect to it using telnet, simply because the IP address was not enabled in SQL Server Configuration Manager > Protocols for %INSTANCE_NAME% > TCP/IP Properties (meaning it had Enabled=0). I corrected that and the telnet connectivity test succeeded.

Still, when I got back to SSMS I was getting the same error – “Could not open a connection to SQL Server. Microsoft SQL Server, Error: 53”. The reason? Even with SQL Server 2016 and the latest versions of SQL Server, I keep forgetting that the latest and greatest version of SSMS still reads alias settings from the x86 registry hive (meaning you need to configure the SQL alias using cliconfg.exe from C:\Windows\SysWOW64) – I have a hard time getting used to it. Interestingly, a completely missing x86 alias triggers the error message “Could not open a connection to SQL Server. Microsoft SQL Server, Error: 53”, while an alias configured with a non-existing server or instance name will give you “SQL Network Interfaces, error: 26 – Error Locating Server/Instance Specified”.

Anyhow, your key takeaways from this post should be:

  • Know your instance port
  • Make sure the IP address is enabled
  • Configure the alias twice (x86/x64) to avoid unpleasant surprises from apps reading settings from the non-configured location
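The x86/x64 alias duplication is easy to verify: alias definitions live in two registry locations, and a quick way to compare them from a command prompt is:

```shell
rem 64-bit alias definitions
reg query "HKLM\SOFTWARE\Microsoft\MSSQLServer\Client\ConnectTo"
rem 32-bit (WOW64) alias definitions - the ones cliconfg.exe from SysWOW64 edits
reg query "HKLM\SOFTWARE\WOW6432Node\Microsoft\MSSQLServer\Client\ConnectTo"
```

If an alias shows up under only one of the two keys, 32-bit and 64-bit clients will behave differently, which is exactly the surprise described above.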

I hope this post saves someone some troubleshooting time.

First blog post for 2019 + new K2 blog announcement

I guess I’m a bit late for writing posts of the “looking back at 2018” and “new year resolutions for 2019” type, as throughout the relevant time period I was busy migrating my blog from a premium shared hosting provider to cloud hosting. The reason for the move was the former provider’s inflexibility with payment options (I was OK with the high price tag but was not OK with their desire to receive it all upfront). The migration process involved some silly mistakes and forced me to learn WordPress internals, but I finally managed to resolve all issues and get my blog up and running (now with HTTPS 🙂 ).

I also keep writing blog posts for the StarWind Blog, and the most recent one was about SharePoint 2019 installation. But something which may qualify as the bigger of my NY resolutions for 2019 is a new blog about K2, which I’m going to write completely in Spanish. I don’t plan to put a huge amount of content there very fast, and I will probably also be translating some of my old K2-related posts into Spanish. You can already bookmark the new site address – k2bpm.es – and stay tuned for new posts, which will arrive as soon as I write them 🙂

Unable to create new/edit existing Oracle Service Instance after changing K2 installation path

Recently I bumped into a problem which was super obvious in retrospect, yet took me some time to untangle. A K2 environment was upgraded from 4.6.11 to 4.7 and the K2 installation path was changed in the process (drive letter). After the upgrade completed without warnings or errors, we did some more testing and found that one of the forms which was using an Oracle Service Instance based SmartObject started to throw an error similar to this one:

Could not load file or assembly – SourceCode.SmartObjects.Services.Oracle.dll

Essentially it was very clear from the error message that the Oracle service instance kept looking for the related assembly in the old installation location (wrong drive letter). We switched to the SmartObjects Services Tool only to see that we were unable to edit or create a new service instance of this service type there. At this point I looked at old cases mentioning a similar error message, and a surprisingly large number of them proposed workarounds and things not quite related to the root cause. We spent some time addressing a missing prerequisite for this service type – 64-bit Oracle Data Access Components (ODAC) version 2.121.2.0 or higher, which is mentioned as such in the 4.7 user guide (_) – and checking some related settings, and so on.

But next I paid attention to the fact that the environment had 2 service types for Oracle: one of them was working, while the other was not. I then dropped the assembly mentioned in the error message into the old installation location and restarted the K2 service – that fixed the first Oracle service instance but broke the other one: it started to say that the assembly SourceCode.SmartObjects.Services.Oracle.dll had already been loaded from another location. This brought my focus back to the real problem – somehow one of the Oracle service types was not updated by K2 Setup Manager to use the new installation path. Probably it was somehow “custom” and was skipped by the installer because of that. Anyhow, my next step was finding where this path is defined. As soon as I confirmed that I cannot see/edit the Service Type definition XML from the SmartObjects Services Tool, I switched to the K2 database to check it there.

A necessary word of warning: back up your K2 database before attempting any direct manipulations in it, and make sure you understand what you are doing before you start 🙂

Service type definitions live in the [SmartBroker].[ServiceType] table, so I located the “problematic” service type to check its XML, which is stored in the ServiceTypeXML column. Here is a sample query to quickly search for a service type definition based on its Display Name:
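The original query isn’t shown in this copy of the post; a simple way to locate the row is to cast the XML column to text and filter on the display name (‘Oracle’ below is an example value – adjust to your service type):

```sql
-- Locate service type rows whose definition XML mentions the display name
SELECT [ServiceTypeXML]
FROM [SmartBroker].[ServiceType]
WHERE CAST([ServiceTypeXML] AS nvarchar(max)) LIKE N'%Oracle%';
```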

That will return the XML column value, which you can click to view as formatted XML; here is an example of how it looks:

Service Type XML

As you can easily see, the service type definition contains an assembly path parameter in its XML. So now it is only a question of updating it with the correct value. Here is a sample script to do that:
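The original update script isn’t reproduced here; as a hedged sketch, the fix boils down to a string replace of the old installation path inside the XML (the paths below are examples – substitute your old and new locations, and remember the backup warning above):

```sql
-- Rewrite the old installation path inside the service type XML (example paths)
UPDATE [SmartBroker].[ServiceType]
SET [ServiceTypeXML] = CAST(
        REPLACE(CAST([ServiceTypeXML] AS nvarchar(max)),
                N'D:\Program Files (x86)\K2 blackpearl\',
                N'C:\Program Files (x86)\K2 blackpearl\') AS xml)
WHERE CAST([ServiceTypeXML] AS nvarchar(max))
      LIKE N'%D:\Program Files (x86)\K2 blackpearl\%';
```

Restart the K2 service afterwards so the new path takes effect.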

That will iron out the problem with the misbehaving service type. I don’t think this can be a very frequent problem, as normally the installer updates all the assembly path definitions with the new path. But, especially if you have some custom service type, you may want to scan your service type definitions for any vestiges of the old installation path. Here is a sample script which will display all service type definitions which contain an old drive letter reference (my example uses “D:\%” as the search criterion):
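A minimal version of such a scan, again using the text-cast approach (the “D:\” drive letter is the example search criterion from this post):

```sql
-- List all service type definitions that still reference the old D: drive
SELECT [ServiceTypeXML]
FROM [SmartBroker].[ServiceType]
WHERE CAST([ServiceTypeXML] AS nvarchar(max)) LIKE N'%D:\%';
```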

I hope this blog post may help someone who bumps into a similar error in K2, and if not, then maybe you can make use of the SQL script samples which filter based on values within XML columns.

P.S. Note that all scripts mentioned above are for K2 4.7. In K2 Five (5.x) the structure of the [SmartBroker].[ServiceType] table has changed – it no longer has an XML column named [ServiceTypeXML]; the assembly path is stored in a dedicated text column [AssemblyLocation] instead.