Aristotle and an Aardvark Go to Washington


I’ve recently completed “Aristotle and an Aardvark Go to Washington” by Thomas Cathcart and Daniel Klein. I bought 3 analogue books (hardcover, i.e. not eBooks) by these authors translated into Russian more than a year ago and only now managed to read one of them :) The fact that these books were lying around for such a long time made me think that I should fully switch to eBooks – at the very least it saves some free space at home :)

In this book, wrapped in humor, satire and anecdotes about the US political arena, you may find quite a few ideas/concepts from formal logic and epistemology, which is quite in line with the book’s subtitle “Understanding Political Doublespeak through Philosophy and Jokes”. The book actually touches a bit on epistemology, the definition of truth and formal logic, and quite thoroughly covers common argument fallacies, but it is written in a way that you can consume it as an easy read without noticing this.

I especially liked the following in this book:

A chapter with a tiny recap of definitions of truth. The book does not discuss in detail the classical account of knowledge spoiled by Edmund Gettier, who introduced cases where the classical account of knowledge fails, but it covers the correspondence theory of truth (Bertrand Russell), the coherence theory of truth (Hilary Putnam) and the pragmatic theory of truth (Charles Sanders Peirce, William James, John Dewey). The book also suggests that former French president Jacques Chirac seems to have introduced his own theory, according to which words said to a journalist are true if, and only if, they can be “recorded and published in press” :) .

A comprehensive overview of formal and informal argument fallacies covered by this book, which includes: appeal to force or appeal to the stick (argumentum ad baculum), thesis replacement or irrelevant conclusion (ignoratio elenchi), appeal to hatred or appeal to spite (argumentum ad odium), argument from ignorance/appeal to ignorance (argumentum ad ignorantiam), weak analogy, slippery slope argument, appeal to nature, appeal to the person (argumentum ad hominem), appeal to hypocrisy or “you too!” (tu quoque!), mind projection fallacy, quoting out of context (aka contextomy/quote mining), equivocation, appeal to authority (argumentum ad verecundiam), accepting blame with a condition, the idea of kairos from classical rhetoric (eukairos and kakokairos), who is speaking? (qui dicit?), with this hence because of this (cum hoc ergo propter hoc, aka “correlation does not imply causation“), after this hence because of this (post hoc ergo propter hoc) and so on.

What’s important, this overview is easy to read. Whether these are fallacies to avoid or tricks to use depends on your vantage point :)

Selected biographies of some talkers and demagogues at the end of the book. They are funny and reminded me of the historical writing style you can find in “An Utterly Impartial History of Britain: (or 2000 Years Of Upper Class Idiots In Charge)” by John O’Farrell.

A tiny bit of critique of “Freakonomics: A Rogue Economist Explores the Hidden Side of Everything” for bad use of statistics. The book hints that some of the brilliant and unexpected conclusions which made that book a bestseller do not seem to be justified, but sometimes we accept something like that just because it is unusual, without examining the author’s arguments deeply enough. Actually, “Freakonomics” was used to illustrate the post hoc ergo propter hoc fallacy. But the stories from that book are so appealing to re-tell to others, with a bit of suspense before you present the conclusions, that it is no wonder its sales exceeded 4 million copies.

I guess I will move on and finally start reading the two other books I bought earlier along with this one – “Plato and a Platypus Walk into a Bar… Understanding Philosophy Through Jokes” and “Heidegger and a Hippo Walk Through Those Pearly Gates: Using Philosophy (and Jokes!) to Explore Life, Death, the Afterlife, and Everything in Between.”


How to disable Shutdown Event Tracker in Windows Server 2008 R2

I’m currently busy building a test environment which comprises multiple Server 2008 R2 boxes. I decided that I need to disable Shutdown Event Tracker, as I don’t need it in a test environment (it is that pesky feature which keeps asking you to specify your reason for a reboot or shutdown). It is controlled by the “Display Shutdown Event Tracker” group policy setting, which can be found under Computer Configuration > Administrative Templates > System, but I guess the scripting way is preferable. So the easiest way to disable Shutdown Event Tracker is to run a PowerShell script which sets the registry values that correspond to the relevant group policy option:
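A minimal sketch of such a script – the policy registry path and the ShutdownReasonOn/ShutdownReasonUI value names are the ones commonly reported for this policy, so verify them against your build; run it from an elevated PowerShell prompt:

```powershell
# Disable Shutdown Event Tracker by setting the registry values behind the
# "Display Shutdown Event Tracker" group policy (path/value names assumed)
$regPath = 'HKLM:\SOFTWARE\Policies\Microsoft\Windows NT\Reliability'

# The policy key does not exist until the policy has been configured at least once
if (-not (Test-Path $regPath)) {
    New-Item -Path $regPath -Force | Out-Null
}

# 0 = policy disabled, i.e. no prompt for a shutdown/reboot reason
Set-ItemProperty -Path $regPath -Name 'ShutdownReasonOn' -Value 0 -Type DWord
Set-ItemProperty -Path $regPath -Name 'ShutdownReasonUI' -Value 0 -Type DWord
```

The change takes effect without a reboot; deleting the values (or setting them to 1) brings the prompt back.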


Installing Windows 8.1/10 on Toshiba Satellite L300

Just a little note which may be of interest for users of old hardware, in particular the Toshiba Satellite L300 laptop. Earlier I wrote about adding an SSD drive to it, and my results were that with Windows Vista an SSD doesn’t help that much: it is better with an SSD but still extremely sluggish, due to the fact that Vista does not support SSDs properly. I also linked the underwhelming performance to the fact that I put x64 Vista on a 2 GB system (and there is a commonly accepted opinion/rule of thumb of not installing x64 Windows on anything with less than 3 GB of RAM). Back then I was not able to install Windows 10 TP on this laptop (likely because of some missing driver).

Recently I finally revisited this laptop and successfully installed Windows 8.1, which was immediately upgraded to Windows 10. So this laptop now runs Windows 10 x64 with 2 GB of RAM and an SSD, and this configuration is significantly faster and more responsive. A really good example that to unlock your hardware’s potential you need support from software; it also illustrates quite well the quality of the work MSFT has done in Windows 8.1 and 10 – they really work better than the old versions even on the same hardware.

A slight issue was the Canon Pixma 630 printer which was in use with this laptop. The Pixma 630 does not officially support Windows 10, so drivers are available only for Windows 8.1 – but they work just fine on Windows 10; I was able to verify this.

Also, in case you are on the fence about the x86 vs x64 question, your extra incentive for selecting x64 may be greater security and quality of drivers. Essentially x64 Windows 8/10 is more secure because it has: mandatory driver signing (x86 is missing this), Address Space Layout Randomization aka ASLR (x86 has it, but on x64 it is much more efficient due to the larger address space), Kernel Patch Protection or KPP aka PatchGuard (prevents software, even drivers running in kernel mode, from patching the Windows kernel; for x86 this is technically possible but was never implemented, to preserve backward compatibility with old software), Data Execution Prevention (more strict and always enabled for x64 programs), the WOW64 compatibility layer (enforces some restrictions on the 32-bit programs which run on top of it) and guess what else? Dropped support for 16-bit applications :)

My conclusion is that nowadays I would only care to install an x86 OS on hardware which has historical value as a vintage piece for collectors or geeks and can’t support an x64 OS; other than that I will either put an x64 OS on it or would recommend upgrading the hardware.


SQL Server: How to attach all databases from specific folder

I recently had to change a SQL Server instance collation using the steps I described in my earlier post. After this operation a dozen databases which were hosted on that instance became detached and I was able to see only the set of the instance’s system DBs. So I had to look for an option to attach all DBs from a folder via script, as doing this manually for 20 or so databases is a bit too much effort for me :)

There is an “Attach all the databases in one folder” script available in the TechNet Script Center which was able to solve this problem for me. All you need is to pass the DatabaseDir parameter to the function, specifying the folder with your DBs:

Just in case, I provide the PS code of this function below, in case you don’t want to grab it from the TechNet Script Center. All credit goes to Ivan Josipovic.
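Below is a hedged SMO-based sketch of the idea behind that function (my own reconstruction rather than a verbatim copy of Ivan Josipovic’s script – in particular the `_log.ldf` naming convention for log files is an assumption, so test against your folder layout first):

```powershell
# Attach every database (.mdf) found in a folder to a SQL Server instance.
# Sketch based on the "Attach all the databases in one folder" idea.
function Attach-AllDatabases {
    param(
        [Parameter(Mandatory = $true)][string]$DatabaseDir,
        [string]$ServerInstance = 'localhost'
    )

    [void][System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.Smo')
    $server = New-Object Microsoft.SqlServer.Management.Smo.Server($ServerInstance)

    foreach ($mdf in Get-ChildItem -Path $DatabaseDir -Filter '*.mdf') {
        $dbName = [System.IO.Path]::GetFileNameWithoutExtension($mdf.Name)
        if ($server.Databases[$dbName]) { continue }  # already attached

        # Collect the .mdf plus a matching log file, if one exists
        # (assumes the common <name>_log.ldf convention)
        $files = New-Object System.Collections.Specialized.StringCollection
        [void]$files.Add($mdf.FullName)
        $ldf = Join-Path $DatabaseDir ($dbName + '_log.ldf')
        if (Test-Path $ldf) { [void]$files.Add($ldf) }

        $server.AttachDatabase($dbName, $files)
        Write-Host "Attached $dbName"
    }
}

# Usage:
# Attach-AllDatabases -DatabaseDir 'D:\SQLData' -ServerInstance 'MYSERVER\MYINSTANCE'
```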


K2 host server eats up my RAM! :) Oh, really?

One of the frequent types of issues I have to work on is high RAM usage by the K2 host server service (normally the description of such a problem is accompanied by the phrase “with no apparent reason”). Most of the time I’m trying to create meaningful K2 community KB articles based on the support cases I work on, but not everything I want to say always fits into the Click2KB format. So to discuss the “K2 host server eats up my RAM/I think I see a memory leak here” issue in detail I decided to write this blog post.

The common symptom and starting point here is that you noticed abnormally high RAM usage by the K2 host server service, which maybe even leads to a service crash or total unresponsiveness of your K2 platform. What’s next, and what are the possibilities here?

Of course, it all depends on what exactly you see.

I think it is quite expected to see that immediately after a server reboot K2 service memory consumption is lower than after the server has worked for a while: once you rebooted your server it starts clean – all threads and allocated memory are clear, hence low RAM usage. But as the server warms up it starts checking if it has tasks to process, and it fires other activities like user and group resolution by the ADUM manager, recording data in the identity cache table and so on. The more task processing threads are active, the more memory is required. And keep in mind your host server threads configuration: if you increased your default thread pool limits, you should realize that this allows the server to use more of the available resources on your server.

An empty (no deployed processes, no active users) K2 host server service has a really tiny memory footprint:

K2 empty server with default thread pool settings

As you can see it uses less than 300 MB of RAM. And even if you double the default thread pool settings (and I heard that resources for them are allocated upfront), memory usage stays the same, at least on a box without any load.

Now we are switching to the interesting stuff, i.e. what could it be if RAM usage of the K2 service is abnormally high?

And here comes the important point: if your process design or custom code has any design flaws, or the hardware is poorly sized for the intended workload, the processing queue starts growing and it may lead to resource overuse. I.e. it is not a memory leak but a bottleneck caused by such things as (and I’m listing them based on the probability of being the cause of your issue):
1) Custom code or process design. An easy proof that this is the cause is the fact that you are unable to reproduce this “memory leak” on an empty platform with no running processes. It tells you that, in a way, there is no memory leak in the K2 platform base code.

You can refer to process design best practices as a starting point here:
http://help.k2.com/kb000352

I have seen enough cases where high memory usage was caused by inefficient process design choices (something like mass uploads to a DB, or updating the properties of 20 MS Word documents in a row designed so that the file is downloaded/uploaded from SharePoint 20 times instead of doing a batch update with one download/upload of the file).

Also, next time you see this high memory usage state, execute the following queries against the K2 database before doing a reboot:

A) Check how many processes are running at the same time right now and whether any of them constantly stays in the running state:
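A sketch of such a query – the Server.ProcInst table and Status column, with 1 meaning Running, are what typical K2 blackpearl schemas use, but verify the names against your K2 version:

```sql
-- Process instances currently in the running state (Status = 1 assumed)
SELECT ID, ProcID, StartDate, Status
FROM [Server].[ProcInst]
WHERE Status = 1
ORDER BY StartDate ASC;

-- Just the count, for repeated sampling
SELECT COUNT(*) AS RunningProcesses
FROM [Server].[ProcInst]
WHERE Status = 1;
```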

It will give you the number of running processes at a specific point in time. Constantly having 20 processes or more in status 1 may indicate a problem, but it is more important to execute this query multiple times with a 1-2 minute interval and see if some of the process instances with the same ID stay running constantly or for a very long time. This will likely be your “offending” process, and you will want to check at which step it is so slow, and so on.

B) Check for processes with abnormally high state size:
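A sketch of this one as well – the column which holds the serialized process state varies between K2 versions, so the [Data] column name here is an assumption; the idea is simply to measure it with DATALENGTH:

```sql
-- Top 20 process instances by serialized state size, in bytes
-- (replace [Data] with the state column of your K2 version)
SELECT TOP 20 ID, ProcID, DATALENGTH([Data]) AS StateSizeBytes
FROM [Server].[ProcInst]
ORDER BY StateSizeBytes DESC;
```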

This query will return the processes with the largest state size in bytes. If any of the processes has a state size of more than 1 MB, this is a problematic process which causes memory overuse, most likely due to the use of looping within the process.

Just an illustrative example of what else can be wrong (and the possibilities are huge here :) ): my colleague ran into an issue where K2 service process memory usage suddenly started growing at a ~16 GB per day rate, and in the end the reason was that every 10 seconds K2 smartactions tried to process an email which was sent to the K2 service account mailbox, which was the same account under which smartactions were configured; this led to a sort of cycle, and each sending attempt ate up a couple of MB of memory. It was only possible to see this with the full logging level, and during the night, when there were no other activities on the server cluttering the log files.

2) Slow response/high latency of external systems or network. Depending on the design of your workflows, they may have dependencies on external systems (SQL, SharePoint), and it could be the case that a slow response from their side causes growth of the queue on the K2 side along with memory usage growth (a sort of vicious circle, or something like a race condition, can be in play here, and it is often difficult to untangle this and isolate the root cause).

In such a scenario it is better to:

A) At the time of the issue, verify the K2 host server logs and ADUM logs to see if there are any timeouts or communication type errors/exceptions.
B) Check all servers which comprise your environment (K2, SQL, SharePoint, IIS) and watch out for resource usage spikes and errors in Event Viewer (leverage the “Administrative Events” view). K2 relies heavily on the SQL server where the K2 DB is hosted, and if it is undersized or overloaded (scheduled execution of some SSIS packages, a scheduled antivirus scan or backup) and slow to respond, you may see memory usage growth/slowness on the K2 server side.
If your servers are virtualized, confirm your K2 vServer placement with your virtualization platform admins – K2 and the K2 DB SQL instance should not coexist on the same vHost with I/O intensive apps (especially Exchange, SharePoint).

You should pay special attention to the ADUM logs – if there are loads of errors, those have to be addressed, as the K2 server may constantly waste resources on futile attempts to resolve some no longer existing SharePoint group provider (site collection deleted, but the group provider still in K2) or on resolving objects from a non-working domain (failed connectivity or trusts config). These resolution attempts eat up resources and may prevent ADUM from timely refreshing things which are needed by running processes, thereby making the situation worse (growing queue).

IMPORTANT NOTE: It never works in large organizations if you just ask your colleagues (SQL admins/virtualization admins) whether all is OK on their side – you will always get the response that all is OK :) You have to ask specific questions and get explicit confirmation of things like VM placement and whether your K2 DB SQL instance is shared with any other I/O intensive apps. You want to have a list and go through it, eliminating possibilities.
I personally worked with one client who spent months troubleshooting performance, reviewing their K2 solutions inside out and searching for a leak, while in the end it was solved by moving the K2 DB to a dedicated SQL Server instance; in hindsight they realized that previously the K2 DB had coexisted with some obscure integration DB – not heavily used, but it had an SSIS package which fired twice a day and maxed out SQL resources for a couple of hours, causing prolonged and varied disruptions to their K2 system. Checking SQL was suggested from the very beginning, and the answer was “we don’t have issues on the SQL side”, even after they asked their SQL admins twice.

3) Inadequate hardware sizing. To get an idea about how to size your K2 server you can look at this table:

Scale out

This may look a bit controversial to you, but this table is from the Performance and Capacity Planning document from the K2 COE, and it illustrates how you have to scale out based on the total number of users and the number of concurrent users, with a base configuration of 1 server with 8 GB of RAM. Depending on your current hardware configuration this may or may not support your idea of scaling up.

Also see these documents on sizing and performance:
http://help.k2.com/kb000589#
http://help.k2.com/kb000401#
K2 blackpearl Performance Testing

Also see this K2 community KB:
http://community.k2.com/t5/tkb/articleprintpage/tkb-id/TKB_blackpearl/article-id/610

4) Memory leak. This is rather unlikely, as K2 code (like the code of any other mature commercial software) goes through strict QA and testing, and, personally, I have seen no more than 3 cases where there was a memory leak type of issue which had to be fixed in K2 – all of them in old versions and in very specific, infrequent scenarios.

If what you observe is not prolonged memory usage spikes which don’t go away by themselves, but rather your K2 service just at times maxing out resource usage and then everything going back to normal with no intervention from your side (such as a K2 service/server restart), then it looks like an insufficient hardware type of situation (though the other issues I mentioned previously may still have influence here). A memory leak rather implies that you need to stop the service or something like that to resolve it.

If after checking all the points mentioned above you still suspect that there could be a memory leak, I would recommend that you open a K2 support case and prepare all K2 logs along with memory dumps collected in the low and high memory usage states (you can obtain instructions on collecting memory dumps from K2 support).


How to: list disks in Windows CMD

I guess everybody knows their way around the DIR command to list folder/drive contents, but what if you need to figure out which drives are available? I recently did a Windows Server 2012 core installation in a native VHD boot scenario, and it led me to the question above when trying to attach my VHDX file from Windows Setup Manager. So I decided to jot down the options to list Windows drives from CMD below:

1. WMIC option:
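For example (the `logicaldisk` alias queries Win32_LogicalDisk and prints the drive letters):

```
wmic logicaldisk get name
```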

or:
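A variant of the same WMIC query which also shows a description of each drive and, for mapped network drives, the provider path:

```
wmic logicaldisk get caption,description,providername
```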

2. fsutil utility:
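This one-liner prints all drive letters known to the system (requires an elevated prompt):

```
fsutil fsinfo drives
```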

3. diskpart. This was actually the only way to list drives I was able to remember off the top of my head – but I didn’t like it, as it requires entering the diskpart context and working from there. And it is easy to do something unrecoverable from this context :)
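A typical interactive session looks like this (be careful – destructive commands such as `clean` live in the same context):

```
diskpart
DISKPART> list disk
DISKPART> list volume
DISKPART> exit
```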

4. PowerShell:
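Either of these works from a PowerShell window, including on a core installation:

```powershell
# Drives visible to the file system provider
Get-PSDrive -PSProvider FileSystem

# Or via WMI, with volume names and sizes
Get-WmiObject -Class Win32_LogicalDisk | Select-Object DeviceID, VolumeName, Size
```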

I drew most of these options from this article, which features screenshots and more details on these commands.


Windows 8.1: Supporting WindowsToGO

Provisioning and configuring Windows To Go is a subject of the 70-687 exam, but 70-688 touches upon supporting it. To create a Windows To Go workspace you have to have Windows 8.1 Enterprise, which ships with the Windows To Go Creator Wizard. It also requires a certified Windows To Go USB flash drive (the list can be found here). It was a great disappointment to me when I was not able to use a generic/non-certified drive to create a WTG workspace, but as I understand it you can get away with any drive which reports itself as a fixed disk.

Key facts to be aware of with regards to WTG:

  • You can create a WTG workspace only from a Windows 8 Enterprise edition machine, and Enterprise edition installation files are required for this (ISO/DVD/WIM)
  • You can use customized images
  • The USB drive must be at least 32 GB + WTG certified
  • You can’t use TPM with WTG. This is simple: TPM protects a specific computer (it is bound to a chip inside of it), but WTG is intended for use on various machines. But you can use BitLocker + a startup password.
  • Hibernate & Sleep are disabled by default, but can be enabled via group policy
  • Windows RE/resetting/refreshing are not available for WTG. Problematic drives have to be reimaged.
  • If a WTG drive has Windows 8.1 Enterprise installed on it, Windows Store apps can roam between multiple PCs with the WTG drive.

Key facts about the WTG host computer (the one you start with a WTG drive):

  • Must have hardware certified to work with Windows 7/8
  • Must not be Windows RT or a Mac
  • Should be considered a temporary host
  • Must meet additional requirements: USB boot support, a CPU architecture which supports the WTG image architecture, a USB 2.0 port or newer with no USB hub, and the minimum requirements of Windows 8 (1 GHz CPU, 2 GB RAM, DirectX 9 graphics card with a WDDM 1.2 or newer driver)

Probably most valuable for the certification exam is to wrap your head around this table, which simply tells you which image you need for a specific firmware/processor architecture.

WTG hosting requirements - image compatibility

It is worth noting that a legacy 64-bit BIOS machine looks like the maximum-compatibility option in the table above.

Somewhat unexpectedly, the WTG startup options are located under Hardware And Sound > Devices And Printers in Control Panel.

WTG startup options

It is not so important for real life, as you either use search or CLI commands to reach anything, but the exam may ask you about this.

There are some group policies related to WTG, but I will describe them in a separate post later.


How to do a quick check of IIS version

Just a brief note on checking the IIS version installed on a server. In case you do a lot of support or scripting, you should appreciate a CLI way of doing this, which doesn’t imply giving extensive instructions for getting the same information from the GUI – instructions which, no matter how detailed, will always be more error prone.

So to get the IIS version you may just execute the following in a PS window:
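One common way is to read the file version of the IIS Manager binary (which also explains why the command fails when the management console is not installed):

```powershell
# The product version of InetMgr.exe reflects the installed IIS version
[System.Diagnostics.FileVersionInfo]::GetVersionInfo("$env:windir\system32\inetsrv\InetMgr.exe").ProductVersion
```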

Sample output from Windows Server 2016 TP5 (it returns 10.0.14300.1000):

Windows Server 2016 TP5 IIS version

Note: If this command doesn’t work but you are pretty sure that IIS is installed on your box, then most likely the IIS Management Console was not installed, and a modern Windows administrator/IT pro should be able to quickly check this by issuing this command (see the related question on stackoverflow.com):
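On Windows Server this can be checked via the ServerManager module (on client SKUs you would use Get-WindowsOptionalFeature instead):

```powershell
# Shows the install state of the IIS Management Console feature
Get-WindowsFeature Web-Mgmt-Console
```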

And if it is missing, just quickly add it via the following command:
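Again on Windows Server (older versions of the module call this cmdlet Add-WindowsFeature):

```powershell
Install-WindowsFeature Web-Mgmt-Console
```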


K2 database collation requirement – finally we have it stated in the right place

If you read my old blog post on installing a SQL Server instance for K2 blackpearl, you are probably aware that the K2 database requires a very specific SQL Server instance collation, in case you care to be in a supported state. The main problem was that this requirement had been mentioned in quite an obscure place which no sane person would ever reach in an endless quest for knowledge :) That original requirement location was quite close to that joke about a ginormous EULA and a vendor injecting a sentence which says: “If you really read through this EULA to this place, please give us a call to claim your $1000 reward”…

There is no doubt that K2 is a very flexible and versatile platform, but when it comes to the K2 database collation you only have a Hobson’s choice. Many product vendors have similar requirements for their back-end databases and there is nothing wrong with that, but the problem we had in the case of K2 was that a lot of people failed to realize that only one collation is required and supported, i.e. the Hobson’s choice was poorly presented to them :)

Finally the K2 blackpearl compatibility matrix was updated this month to reflect this requirement, and I really hope this will clarify the K2 collation requirement once and for all. We all agree that the compatibility matrix, at least, is something we all read before rushing into an installation or upgrade, right? 😉

So navigate to the K2 blackpearl Compatibility Matrix page > SQL Server section notes and… lo and behold:

blackpearl collation requirement

I hope this will help people avoid collation related issues from now on, and that we are all clear that:

Latin1_General_CI_AS collation is required on the SQL server instance hosting the K2 database


The Jungle by Upton Sinclair

As I’m trying to cultivate a habit of reflecting on whatever information I consume, I’m trying to write a blog post on each book I read. Recently I completed “The Jungle” by Upton Sinclair, which I started to read with no knowledge about the author or plot apart from a phrase that “this book propelled the author’s political career” or something like that.

I allowed myself a bit of Wikipedia reading once I was done with the book, and it is interesting to see that there was a “Federal response” to the book by President Theodore Roosevelt, who described the author as a “crackpot” because of the writer’s socialist positions. And to quote the author of the book on socialism, he said in 1951:

“The American People will take Socialism, but they won’t take the label.”

Upton Sinclair, Letter to Norman Thomas (25 September 1951)

Irrespective of whether you think socialism is a crackpot idea or not, the book is worth reading, unless you are that type of “oversimplify it” person who would never read anything like “Das Kapital” and employs the joke that “this book would better have been burnt before it saw the light, as it produced too much bloodshed, revolutions and a couple of evil empires” as an excuse for not reading what that bearded guy meant to say to begin with.

The first quarter or even half of this book is an amazingly vivid description of the squalor and hardships of a wage worker at a time when wage slavery was commonplace. But the book takes you through a couple of cycles and unexpected turns through the eyes of a naive Lithuanian immigrant to the US who hoped to find his happiness doing decent work in The Yards.


Union Stock Yards, Chicago, 1947 (source)

My initial impression was that if you lack enthusiasm or excitement about your current job or workplace you should read this book: it will switch your perception a little bit, I’m sure. But as the story goes on and the book elaborates on the unscrupulous practices of business owners and the world of politics, you will gradually return to reality, thinking that in a way the world has not changed as much as our shiny media channels present it to us (and given the fact that you have the ability to select your media channel, you may end up living in an “echo chamber” of the reality of your choice, where your view of the world is supported by media and evidence which you selected just because they support your view).

Anyhow, I had a chance to work on a production line just a little bit at some point in my life, and thank god it was not meat production (though there was a chance of ending up there at one point of my life :) ), and yes, conditions are better, but not radically so, as essentially the model is the same, and with the overwhelming win of consumerism and demonstrative consumption in society you may feel the divide between the wage worker and the office staff even more sharply. I still remember an interesting feeling: while signing off from a short stint as a worker on a production line I had to get a sign-off from the person responsible for personnel’s food, and on that occasion I had to visit the administrative personnel canteen donned in my dirty working suit, causing glances from the neatly dressed administrative personnel. It is one thing to know that some people have a separate canteen and sit in a clean office while you do your shift in the noisy and dusty environment of the production line, and completely another to be exposed to such a contrast – I kind of felt the divide and that type of “you don’t belong here” attitude back then. In short, working conditions are definitely way better nowadays, but not as radically different as some dreamers or careless optimists, never caring to look around them, may think.

After the author almost reaches the peak of his depiction of how poor life conditions may destroy one’s optimism, health and even system of values in life, the book takes an unexpected turn. And here I can quote the book, I guess:

“They were trying to save their souls—and who but a fool could fail to see that all that was the matter with their souls was that they had not been able to get a decent existence for their bodies?”

The first turn is towards showing dirty politics and a defunct society system, which I guess would be an amazing read for somebody who takes for granted the hyper-efficient image of America as a dream state of freedom and equal opportunities, an image it projects masterfully with a barrage of Hollywood movies and whatnot. I can recommend “Detroit: An American Autopsy” by Charlie LeDuff for those who need a more up to date reality check to contrast the image with reality.

And quite unexpectedly (for me, as I hadn’t read anything either about the author or about the book), our protagonist, fallen and corrupted by life at the bottom of Chicago’s society, who seems about to die from his miserable life conditions within a few pages or so, discovers a new wonder and purpose and hope – socialism. And this is what is uncovered in the last quarter of the book, giving it a sort of almost happy end, if you can call it that given what happened to our hero in the first half of the book.

And apart from socialism, the last jump or twist is on public healthcare in general and eating meat in particular – it is quite interesting to see a passage arguing that the meat producing industry is largely a profit-driven attempt to earn on poor species whose brains are programmed to be not only “pattern recognition machines” but, alas, also “pleasure seeking machines”, and that in a healthier “socialist” society moving away from meat consumption would be sort of natural and unavoidable.

Anyhow, this was a strong book which is worth reading, and to conclude, just one more quote from the book to spark your interest, maybe:

“And now in the union Jurgis met men who explained all this mystery to him; and he learned that America differed from Russia in that its government existed under the form of a democracy. The officials who ruled it, and got all the graft, had to be elected first; and so there were two rival sets of grafters, known as political parties, and the one got the office which bought the most votes.”
