How to install a Windows feature after you removed it with the “-Remove” flag

Windows Server 2012 introduced the so-called “Features on Demand”, which is nothing more than a fancy term for removing unused feature binaries (aka “feature payloads”) from your installation to decrease its footprint, with the caveat that you can’t bring them back without using some external source. In conjunction with the Minimal Server Interface and the post-install ability to move along the Full GUI / Core spectrum, a lot of folks also try to remove feature payloads, i.e. they do something like:
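For example, removing a feature together with its payload looks like this (the feature name here is just an illustration; “-Remove” is what deletes the binaries from the side-by-side store instead of merely disabling the feature):

```powershell
# Disable the feature AND delete its payload from the component store
Uninstall-WindowsFeature -Name PowerShell-ISE -Remove
```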

This works just fine, but bringing a removed feature back may be difficult, especially when you removed the feature from an OS with updates applied, yet you don’t have an image with those updates. And even if you don’t have any updates, you may have a hard time figuring out how to properly specify the source to bring the removed feature back. So here is an example of how to do it:

First, mount your Windows Server installation ISO and make sure you know the relevant drive letter (it is “D” in the example below). Next, list the images available in the install.wim file to identify the image ID of the Windows edition you need (Standard/Datacenter, Full/Core) by means of the following command:
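With DISM, listing the images looks like this (assuming the ISO is mounted as drive D:):

```cmd
dism /Get-WimInfo /WimFile:D:\sources\install.wim
```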

Screenshot of the output:

dism-get-wiminfo

In my case it was necessary to use the image with index ID 2 (Standard edition, Full installation). With this info you can install the required feature, specifying the image ID as part of the source:
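A sketch of the install command, pointing -Source at image index 2 inside install.wim (the feature name is illustrative; the WIM:path:index syntax is what tells the cmdlet which image to pull the payload from):

```powershell
# Restore the removed payload from image index 2 of the mounted ISO
Install-WindowsFeature -Name PowerShell-ISE -Source WIM:D:\sources\install.wim:2
```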

Sample screenshot:

Install-WindowsFeature from source

I think this example will be enough for you to bring back your “Features on Demand” 🙂

 


Domain controller without NTDS settings in Active Directory Sites and Services

Just a quick note as I go through the 70-410 training from CBT Nuggets. Normally, whenever you open Active Directory Sites and Services, you see a list of the domain controllers within your site(s):

NTDS Settings

And each of them has an NTDS Settings node. But what if you see a domain controller without NTDS Settings here? It means that one of your domain controllers was demoted or removed from the environment; whenever you do this, you have to remove the domain controller from its site manually so that it is no longer presented in this console.

And, yes, to configure an existing DC as a Global Catalog you just open the NTDS Settings Properties and select the respective checkbox:

NTDS Settings - Global Catalog


K2 blackpearl Installation: Configuring MSDTC properties – Part 2

This post is an addition to my older post about configuring MSDTC in a K2 environment, and it was triggered by the following error:

K2 Designer error - Partner Transaction manager disabled support for remote transactions

So basically I had a freshly installed K2 4.6.6 environment (don’t ask me why I’m using such an old version 🙂), and the first deployment of a simplistic workflow gave me this error.

And if the error message text, “The partner transaction manager has disabled its support for remote/network transactions. (Exception from HRESULT: 0x8004D025)”, doesn’t tell you that something is wrong with your MSDTC config, a quick Google search will confirm it for you.

The thing is that everything you need to know is indeed covered by the K2 documentation. But the problem with any software documentation is that, somewhat like a good dictionary, its creation is driven by certain standards, making it perfect for specific lookups of information while deterring readers from reading it end to end; and, in contrast with dictionaries, software documentation does not have a super simple data organization facilitating quick and precise lookups. What I mean is that people rarely read specific sections of it unless a specific error drives them to a specific page 🙂

To recap the MSDTC requirements for K2: you need to have it configured on all K2 servers and all SQL servers used by K2 (have clusters? configure it on all nodes). As you saw in my previous blog post, it boils down to setting a number of checkboxes on the Security tab of the Local DTC properties, which is reachable through the following commands: dcomcnfg or comexp.msc (I still keep forgetting these 🙂).
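If you prefer scripting over clicking through the Security tab, those checkboxes map to registry values under the MSDTC Security key. This is a sketch based on the commonly documented value names; verify them on your OS version before applying:

```powershell
# Assumed registry equivalents of the Local DTC Security tab checkboxes
$dtc = 'HKLM:\SOFTWARE\Microsoft\MSDTC\Security'
Set-ItemProperty -Path $dtc -Name NetworkDtcAccess             -Value 1
Set-ItemProperty -Path $dtc -Name NetworkDtcAccessTransactions -Value 1
Set-ItemProperty -Path $dtc -Name NetworkDtcAccessInbound      -Value 1
Set-ItemProperty -Path $dtc -Name NetworkDtcAccessOutbound     -Value 1

# Changes take effect after the MSDTC service restart
Restart-Service MSDTC
```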

It is worth noting that K2 Setup Manager is capable of setting these properties on the K2 servers, but you have to go to SQL and apply the same settings there too. This was the first correction I made in my environment after seeing this error. But it was not enough. Looking a little bit more into the K2 documentation, I noticed this:

MSDTC Firewall config 1

I actually decided to do this via the GUI on the SQL server; what you need to do is enable all 3 rules from the MSDTC group:

MSDTC Firewall config 2
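For scripting fans, the same rule group can be enabled from PowerShell on Windows Server 2012 and later (this assumes the built-in firewall rule group name, which is “Distributed Transaction Coordinator” in English installations):

```powershell
# Enable all inbound/outbound MSDTC firewall rules in one go
Enable-NetFirewallRule -DisplayGroup 'Distributed Transaction Coordinator'
```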

And you have to enable this on all K2 servers and SQL servers. Trust me, I tried enabling it on the SQL servers only first 🙂 The same error persists until you enable it on both the K2 and SQL servers.

Reading OED

“Reading the OED: One Man, One Year, 21,730 Pages” by Ammon Shea

It has been silent here for a while, as I was enjoying some time relaxing at the seaside. My holiday gave me a bit of time to catch up on a couple of books which had been on my reading list for way too long. I managed to complete “Reading the OED: One Man, One Year, 21,730 Pages” by Ammon Shea and “Show Stopper!: The Breakneck Race to Create Windows NT and the Next Generation at Microsoft” by G. Pascal Zachary. This blog post is about the first one.

Looking at the title “Reading the OED: One Man, One Year, 21,730 Pages”, you may get quite a good idea of the storyline and contents of this book. It is indeed about a man who sat down for one year with a paper set of OED volumes and read them all, from A to Z. Are you intrigued already? Or, like most normal people, are you wondering “Why?”… Essentially, it is a book for vocabulary/dictionary geeks by a dictionary geek. The author simply shares his experience as he goes through the OED, paying deserved homage to a masterful book of the class that is judged by its completeness, yet never read in its entirety.

I was really excited to read it, as I had been well prepared for (and fascinated by) the OED thanks to Simon Winchester’s books “The Meaning of Everything” and “The Professor and the Madman: A Tale of Murder, Insanity, and the Making of the Oxford English Dictionary”. Also, in the course of learning English the ESOL way and prepping for the FCE/CAE/CPE exams, I acquired a great love of (and annoying habit of) paying attention to advanced vocabulary and the completeness of my English vocabulary, to the extent that I can’t resist the temptation to learn that new/old obscure/sesquipedalian word whenever I bump into one. I guess it is high time for me to order this mug:

sesquipedalian

What is interesting about this book is that its author, a vocabulary/dictionary geek who complains that even at a conference of professional linguists, where he had a lot of fun, his pursuit of reading the OED from cover to cover was poorly understood, still wears a vocabulary-skeptic hat 🙂 Meaning that, unlike some other authors of books about words and vocabulary (especially a book like “Verbal Advantage”), he gives you a disclaimer right off the bat that a big, advanced vocabulary, or, even worse, knowledge of a huge number of really obscure words, won’t bring you any tangible benefits. Rather, he warns us, after reading a lot of the OED you may lose the ability to communicate in normal language understood by people. But nonetheless, there is a lot of fun in knowing a word that means a specific thing or idea you didn’t know existed until you found it. It’s like, “I always thought there should be a word for this, and lo and behold, I finally found it!” 🙂

Another funny thing is that, despite being a geek spending tons of time in the library, he is conscious enough to observe the strange “library people” around him and even pause to reflect a bit on whether he is becoming one of them 🙂

So most people will decide whether to read this book or not after a single glimpse at its title, which for me was enough to put it on my reading list. I really liked it.

Aristotle and an Aardvark Go to Washington

Books: Aristotle and an Aardvark Go to Washington

I’ve recently completed “Aristotle and an Aardvark Go to Washington” by Thomas Cathcart and Daniel Klein. I bought 3 analogue books (hardcover, i.e. not eBooks) by these authors, translated into Russian, more than a year ago, and only now managed to read one of them 🙂 The fact that these books were lying around for such a long time makes me think that I should fully switch to eBooks; at the very least it saves some free space at home 🙂

In this book, wrapped into humor, satire and anecdotes about the US political arena, you may find quite a few ideas and concepts from formal logic and epistemology, which is quite in line with the book’s subtitle, “Understanding Political Doublespeak through Philosophy and Jokes”. The book actually touches a bit on epistemology, the definition of truth and formal logic, and quite thoroughly covers common argument fallacies, but it is written in a way that lets you consume it as an easy read without noticing this.

I especially liked the following in this book:

The chapter with a tiny recap of the definitions of truth. The book does not discuss in detail the classical account of knowledge, spoiled by Edmund Gettier, who introduced cases where it fails, but it covers the correspondence theory of truth (Bertrand Russell), the coherence theory of truth (Hilary Putnam) and the pragmatic theory of truth (Charles Sanders Peirce, William James, John Dewey). The book also suggests that former French president Jacques Chirac introduced his own theory, according to which words said to a journalist are true if, and only if, they can be “recorded and published in the press” 🙂

The comprehensive overview of formal and informal argument fallacies, which includes: appeal to authority (argumentum ad verecundiam), the argument from force or appeal to the stick (argumentum ad baculum), thesis replacement or irrelevant conclusion (ignoratio elenchi), appeal to hatred or appeal to spite (argumentum ad odium), the argument from ignorance or appeal to ignorance (argumentum ad ignorantiam), weak analogy, the slippery slope argument, appeal to nature, the personal attack (argumentum ad hominem), appeal to hypocrisy or “you too!” (tu quoque!), the mind projection fallacy, quoting out of context (aka contextomy/quote mining), equivocation, accepting blame with conditions, the idea of kairos from classical rhetoric (eukairos and kakokairos), “who is speaking?” (qui dicit?), “with this, therefore because of this” (cum hoc ergo propter hoc, aka “correlation does not imply causation”), “after this, therefore because of this” (post hoc ergo propter hoc) and so on.

What’s important is that this overview is easy to read. Whether these are fallacies or tricks to use depends on your vantage point 🙂

The selected biographies of some talkers and demagogues at the end of the book. They are funny, and they reminded me of the same historical writing style you can find in “An Utterly Impartial History of Britain: (or 2000 Years Of Upper Class Idiots In Charge)” by John O’Farrell.

The tiny bit of critique of “Freakonomics: A Rogue Economist Explores the Hidden Side of Everything” for bad use of statistics. The book hints that some of the brilliant and unexpected conclusions which made that book a bestseller do not seem to be justified, but sometimes we accept something like that just because it is unusual, without examining the author’s arguments deeply enough. Actually, “Freakonomics” was used to illustrate the post hoc ergo propter hoc fallacy. But the stories from that book are so appealing to re-tell to others, with a bit of suspense before you present the conclusions, that it is no wonder its sales exceeded 4 million copies.

I guess I will move on and finally start reading the two other books I bought earlier along with this one: “Plato and a Platypus Walk into a Bar… Understanding Philosophy Through Jokes” and “Heidegger and a Hippo Walk Through Those Pearly Gates: Using Philosophy (and Jokes!) to Explore Life, Death, the Afterlife, and Everything in Between.”


How to disable Shutdown Event Tracker in Windows Server 2008/2012

I’m currently busy building a test environment comprising multiple Server 2008 R2 boxes. I decided to disable the Shutdown Event Tracker, as I don’t need it in a test environment (it is that pesky feature which keeps asking you to specify your reason for a reboot or shutdown). It is controlled by the “Display Shutdown Event Tracker” group policy setting, which can be found under Computer Configuration > Administrative Templates > System, but I guess the scripting way is preferable. So the easiest way to disable the Shutdown Event Tracker is to run a PowerShell script which sets the registry values corresponding to the relevant group policy option:
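A sketch of such a script, assuming the standard policy location under the Reliability key (ShutdownReasonOn suppresses the tracker, ShutdownReasonUI the dialog UI):

```powershell
# Disable the Shutdown Event Tracker via the policy registry values
$key = 'HKLM:\SOFTWARE\Policies\Microsoft\Windows NT\Reliability'
if (-not (Test-Path $key)) { New-Item -Path $key -Force | Out-Null }
Set-ItemProperty -Path $key -Name ShutdownReasonOn -Value 0 -Type DWord
Set-ItemProperty -Path $key -Name ShutdownReasonUI -Value 0 -Type DWord
```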

Toshiba Satellite L300

Installing Windows 8.1/10 on Toshiba Satellite L300

Just a little note which may be of interest to users of old hardware, in particular the Toshiba Satellite L300 laptop. Earlier I wrote about adding an SSD drive to it, and my conclusion was that with Windows Vista an SSD doesn’t help that much: it is better with the SSD, but still extremely sluggish, due to the fact that Vista does not support SSDs properly. I also linked the underwhelming performance to the fact that I put x64 Vista on a 2 GB system (and there is a commonly accepted opinion, a rule of thumb, of not installing x64 Windows on anything with less than 3 GB of RAM). Back then I was not able to install the Windows 10 Technical Preview on this laptop (likely because of some missing driver).

Recently I finally revisited this laptop and successfully installed Windows 8.1, which was immediately upgraded to Windows 10. So this laptop now runs Windows 10 x64 with 2 GB of RAM and an SSD, and this configuration is significantly faster and more responsive. It is a really good example of the fact that to unlock your hardware’s potential you need support from software, and it also illustrates quite well the quality of the work MSFT did in Windows 8.1 and 10: they really work better than the old versions, even on the same hardware.

A slight issue was the Canon Pixma 630 printer which was in use with this laptop. The Pixma 630 does not officially support Windows 10, so drivers are available only for Windows 8.1, but they work just fine on Windows 10; I was able to verify this.

Also, in case you are on the fence about the x86 vs. x64 question, your extra incentive for selecting x64 may be greater security and quality of drivers. Essentially, x64 Windows 8/10 is more secure because it has: mandatory driver signing (x86 lacks this); Address Space Layout Randomization, aka ASLR (x86 has it, but on x64 it is much more efficient due to the larger address space); Kernel Patch Protection, or KPP, aka PatchGuard, which prevents software, even drivers running in kernel mode, from patching the Windows kernel (for x86 this is technically possible, but it was never implemented, to preserve backward compatibility with old software); Data Execution Prevention (more strict and always enabled for x64 programs); the WOW64 compatibility layer (which enforces some restrictions on the 32-bit programs that run on top of it); and, guess what else, dropped support for 16-bit applications 🙂

My conclusion is that nowadays I would only care to install an x86 OS on hardware which has historical value as a vintage piece for collectors or geeks and can’t support an x64 OS; other than that, I will either put an x64 OS on it or recommend a hardware upgrade.


SQL Server: How to attach all databases from specific folder

I recently had to change a SQL Server instance collation using the steps I described in my earlier post. After this operation, a dozen databases which were hosted on that instance became detached, and I was able to see only the instance’s set of system DBs. So I had to look for an option to attach all the DBs from a folder via script, as doing this manually for 20 or so databases was a bit too much effort for me 🙂

There is an “Attach all the databases in one folder” script available in the TechNet Script Center which solved this problem for me. All you need is to pass the DatabaseDir parameter to the function, specifying the folder with your DBs:

Just in case you don’t want to grab it from the TechNet Script Center, I provide the PS code of this function below. All credit goes to Ivan Josipovic.
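Since I can’t reproduce the original function verbatim here, below is a minimal sketch of the same idea, not Ivan Josipovic’s original code. It assumes the SqlServer PowerShell module (for Invoke-Sqlcmd) and the common `<name>_log.ldf` log file naming convention:

```powershell
# For every .mdf in the folder, build and run a CREATE DATABASE ... FOR ATTACH
# statement, picking up a matching _log.ldf file when one exists.
function Attach-AllDatabases {
    param(
        [Parameter(Mandatory)][string]$DatabaseDir,
        [string]$Instance = 'localhost'
    )
    Get-ChildItem -Path $DatabaseDir -Filter *.mdf | ForEach-Object {
        $dbName = $_.BaseName
        $files  = "(FILENAME = N'$($_.FullName)')"
        $ldf    = Join-Path $DatabaseDir "$($dbName)_log.ldf"
        if (Test-Path $ldf) { $files += ", (FILENAME = N'$ldf')" }
        $sql = "CREATE DATABASE [$dbName] ON $files FOR ATTACH;"
        Invoke-Sqlcmd -ServerInstance $Instance -Query $sql
    }
}

# Usage: attach everything found in D:\SQLData
Attach-AllDatabases -DatabaseDir 'D:\SQLData'
```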

Memory Leaks Everywhere

K2 host server eats up my RAM! :) Oh, really?

One of the frequent types of issues I have to work on is high RAM usage by the K2 host server service (normally the description of such a problem is accompanied by the phrase “with no apparent reason”). Most of the time I try to create meaningful K2 community KB articles based on the support cases I work on, but not everything I want to say fits into the Click2KB format. So, to discuss the “K2 host server eats up my RAM / I think I see a memory leak here” issue in detail, I decided to write this blog post.

The common symptom and starting point here is that you notice abnormally high RAM usage by the K2 host server service, which may even lead to a service crash or total unresponsiveness of your K2 platform. What’s next, and what are the possibilities here?

Of course, it all depends on what exactly you see.

I think it is quite expected that immediately after a server reboot the K2 service memory consumption is lower than after the server has been working for a while: once you reboot your server, it starts clean, with all threads and allocated memory cleared, hence low RAM usage. But as the server warms up, it starts checking whether it has tasks to process, and it fires other activities, like user and group resolution by the ADUM manager, recording data in the identity cache table, and so on. The more task-processing threads are active, the more memory is required. And keep your host server thread configuration in mind: if you increased the default thread pool limits, you should realize that this allows the server to use more of the available resources.

An empty (no deployed processes, no active users) K2 host server service has a really tiny memory footprint:

K2 empty server with default thread pool settings

As you can see, it uses less than 300 MB of RAM. And even if you double the default thread pool settings (and I heard that resources for those are allocated upfront), memory usage stays the same, at least on a box without any load.

Now we switch to the interesting stuff, i.e. what could it be if the RAM usage of the K2 service is abnormally high?

And here comes an important point: if your process design or custom code has any design flaws, or your hardware is poorly sized for the intended workload, the processing queue starts growing, and that may lead to resource overuse. I.e. it is not a memory leak but a bottleneck, caused by such things as (and I’m listing them by probability of being the cause of your issue):
1) Custom code or process design. An easy proof that this is the cause is the fact that you are unable to reproduce this “memory leak” on an empty platform with no running processes, which tells you that there is no memory leak in the K2 platform base code.

You can refer to process design best practices as a starting point here:
http://help.k2.com/kb000352

I have seen enough cases where high memory usage was caused by inefficient process design choices (something like mass uploads to a DB, or updating the properties of 20 MS Word documents in a row, designed so that the file is downloaded/uploaded 20 times from SharePoint instead of doing a batch update with one download/upload of the file).

Also, next time you see this high memory usage state, before doing a reboot, execute the following queries against the K2 database:

A) Check how many processes are running at the same time right now, and whether any of them constantly stays in the running state:
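The query itself was lost from this post; a sketch of what it would look like is below. The table and column names are my assumption based on the K2 database schema (status 1 = Running), so check them against your K2 version:

```sql
-- Count process instances currently in "Running" state (status 1);
-- run it several times a couple of minutes apart and compare the IDs.
SELECT ID, Status, StartDate
FROM [K2].[Server].[ProcInst] WITH (NOLOCK)
WHERE Status = 1
ORDER BY StartDate;
```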

It will give you the number of running processes at a specific point in time. Constantly having 20 or more processes in status 1 may indicate a problem, but more important is to execute this query multiple times at 1–2 minute intervals and see whether some process instances with the same ID stay running constantly, or for a very long time. That will likely be your “offending” process, and you will want to check at which step it is so slow, and so on.

B) Check for processes with an abnormally high state size:
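Again a sketch rather than the original query, with the same caveat about assumed table and column names in the K2 database:

```sql
-- Find process instances with the largest serialized state (in bytes);
-- anything above ~1 MB usually points at looping within the process.
SELECT TOP 10 ID, ProcSetID, DATALENGTH([State]) AS StateSizeBytes
FROM [K2].[Server].[ProcInst] WITH (NOLOCK)
ORDER BY StateSizeBytes DESC;
```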

This query will return the processes with the largest state size in bytes. If any process has a state size of more than 1 MB, it is a problematic process which causes memory overuse, most likely due to the use of looping within the process.

Just an illustrative example of what else can be wrong (and the possibilities are huge here 🙂): a colleague of mine ran into an issue where K2 service process memory usage suddenly started growing at a rate of ~16 GB per day, and in the end the reason was that every 10 seconds K2 SmartActions tried to process an email which had been sent to the K2 service account mailbox, which was the same account under which SmartActions were configured. This led to a sort of cycle, and each sending attempt ate up a couple of MB of memory. It was only possible to see this with the full logging level, and during the night, when there were no other activities on the server cluttering the log files.

2) Slow response/high latency of external systems or the network. Depending on the design of your workflows, they may have dependencies on external systems (SQL, SharePoint), and it could be the case that a slow response from their side causes a queue to grow on the K2 side, along with memory usage (a sort of vicious circle, or something like a race condition, can be in play here, and it is often difficult to untangle it and isolate the root cause).

In such a scenario it is better to:

A) At the time of the issue, check the K2 host server logs and the ADUM logs to see if there are any timeouts or communication-type errors/exceptions.
B) Check all the servers which comprise your environment (K2, SQL, SharePoint, IIS) and watch out for resource usage spikes and errors in Event Viewer (leverage the “Administrative Events” view). K2 relies heavily on the SQL server where the K2 DB is hosted, and if it is undersized or overloaded (scheduled execution of some SSIS packages, a scheduled antivirus scan or backup) and slow to respond, you may see memory usage growth and slowness on the K2 server side.
If your servers are virtualized, confirm the placement of your K2 vServers with the virtualization platform admins: K2 and the K2 DB SQL instance should not coexist on the same vHost with I/O-intensive apps (especially Exchange and SharePoint).

You should pay special attention to the ADUM logs: if there are loads of errors, they have to be addressed, as the K2 server may constantly waste resources on futile attempts to resolve some no-longer-existing SharePoint group provider (site collection deleted, but the group provider is still in K2), or on resolving objects from a non-working domain (failed connectivity or trust config). These resolution attempts eat up resources and may prevent ADUM from timely refreshing the things needed by running processes, thereby making the situation worse (a growing queue).

IMPORTANT NOTE: In large organizations it never works to just ask your colleagues (SQL admins/virtualization admins) whether all is OK on their side: you will always get the response that all is OK 🙂 You have to ask specific questions and get explicit confirmation of things like VM placement and whether your K2 DB SQL instance is shared with any other I/O-intensive apps. You want to have a list and go through it, eliminating possibilities.
I personally worked with one client who spent months troubleshooting performance, reviewing their K2 solutions inside out and searching for a leak, while in the end it was solved by moving the K2 DB to a dedicated SQL Server instance. In hindsight, they realized that the K2 DB had previously coexisted with some obscure integration DB which was not heavily used, but which had an SSIS package firing twice a day that maxed out SQL resources for a couple of hours, causing prolonged and varied disruptions to their K2 system. Checking SQL was suggested from the very beginning, and the answer was “we don’t have issues on the SQL side”, even after they asked their SQL admins twice.

3) Inadequate hardware sizing. To get an idea of how to size your K2 server, you can look at this table:

Scale out

This may look a bit controversial to you, but this table is from the Performance and Capacity Planning document from the K2 COE, and it illustrates how you have to scale out based on the total number of users and the number of concurrent users, with a base configuration of 1 server with 8 GB of RAM. Depending on your current hardware configuration, this may or may not support your idea of scaling up.

Also see these documents on sizing and performance:
http://help.k2.com/kb000589#
http://help.k2.com/kb000401#
K2 blackpearl Performance Testing

Also see this K2 community KB:
http://community.k2.com/t5/tkb/articleprintpage/tkb-id/TKB_blackpearl/article-id/610

4) A memory leak. This is rather unlikely, as K2 code (like the code of any other mature commercial software) goes through strict QA and testing; personally, I have seen no more than 3 cases where there was a memory-leak type of issue which had to be fixed in K2, and those were all in old versions and in very specific, infrequent scenarios.

If what you observe is not prolonged memory usage spikes which do not go away by themselves, but rather your K2 service at times maxing out resource usage and then everything going back to normal with no intervention from your side (such as a K2 service/server restart), then it looks like an insufficient-hardware type of situation (though the other issues I mentioned previously may still have an influence here). A memory leak rather implies that you need to stop the service, or something similar, to resolve it.

If, after checking all the points mentioned above, you still suspect that there could be a memory leak, I would recommend opening a K2 support case and preparing all K2 logs, along with memory dumps collected in the low and high memory usage states (you can obtain instructions on collecting memory dumps from K2 support).


How to: list disks in Windows CMD

I guess everybody knows their way around the DIR command to list folder and drive contents, but what if you need to figure out which drives are available? I recently did a Windows Server 2012 Core installation in a native VHD boot scenario, and it led me to that question when trying to attach my VHDX file from Windows Setup. So I decided to jot down the options for listing Windows drives from CMD below:

1. WMIC option:
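For example, to list the drive letters:

```cmd
wmic logicaldisk get name
```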

or:
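Or, with a few more columns of detail:

```cmd
wmic logicaldisk get deviceid,volumename,description
```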

2. fsutil utility:
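This one is short and sweet (note that fsutil requires an elevated prompt):

```cmd
fsutil fsinfo drives
```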

3. diskpart. This was actually the only way to list drives I was able to remember off the top of my head, but I didn’t like it, as it requires entering the diskpart context and working from there. And it is easy to do something unrecoverable from that context 🙂
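Inside the diskpart context, the listing itself looks like this:

```cmd
diskpart
list volume
exit
```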

4. PowerShell:
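Either of these works; Get-Volume requires Windows Server 2012 / Windows 8 or later:

```powershell
Get-PSDrive -PSProvider FileSystem
Get-Volume
```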

I drew most of these options from this article, which features screenshots and more details on these commands.
