Site downtime / Synology DS415+ memory expansion

Some of you may have seen a short period of downtime on my blog a little while ago. To compensate for any inconvenience it may have caused those who noticed it, here is a blog post explaining why it happened.

For quite a while now I have been hosting my WordPress blog on a Synology DS415+ NAS box, and I'm going to continue doing so (at least, I have no plans to move back to wordpress.com or anywhere else so far). So I recently bought this:

DS415+ RAM Upgrade 001

This is an 8 GB SO-DIMM DDR3 module from Kingston (KVR16S11/8, PC3-12800) which I bought from a local vendor to replace the 2 GB module the DS415+ has installed in its default configuration. Officially, Synology does not support memory expansion, which is quite obvious from the amount of work you need to do to reach the SO-DIMM slot :) There is also only one slot, so all you can do is swap the default module for a new one. A number of people have reported successfully replacing memory modules on this box, so I was pretty sure it would work out (don't ask me about the practical gains from the memory increase – that's a difficult question which I leave out of this blog post).

The entire process is straightforward, and the two questions you may have are which memory modules will do and how to disassemble the box. The disassembly steps are described below. As for memory modules, the factory-preinstalled module is labeled with a DSL sticker, uses SEC chips (Samsung Electronics Co., Ltd.) and has the following specs: DDR3 1600 2GB CL11. So essentially I was able to replace it with an 8 GB PC3-12800 module. By the way, these fancy PC3-XXXXX numbers are module names per the JEDEC standards: PC3-12800 corresponds to the DDR3-1600 standard, and 12800 is the theoretical bandwidth of the module in MB/s (1600 MT/s × 8 bytes per transfer = 12,800 MB/s).

So before replacing the memory module we have 2 GB:

Before - System Information

Below you can find the disassembly steps. First, switching off and disconnecting the box and placing it somewhere convenient. As you can see, working 24×7 this box collects loads of dust rather quickly:

DS415+ RAM Upgrade 002

Removing the drives; in my case I have two 6 TB WD Red drives and two vacant bays :) :

DS415+ RAM Upgrade 003

Box without drives/drive cages:

DS415+ RAM Upgrade 004

Removing the screws on the back side of the box (3 in total). One on the top:

DS415+ RAM Upgrade 007

And two on the bottom:

DS415+ RAM Upgrade 005

DS415+ RAM Upgrade 008

The most difficult part is removing the upper cover. After looking at a similar set of pictures posted by somebody on the Synology forums, I was confused because the pictures there highlighted some locks on the top of the device – don't try to apply force there. Instead, you have to push from the inside of the box on the part of the cover opposite the LED/button side: push it from the inside, slide the cover back so that it comes free from the two clips located where you push, and then slide it further to free the cover from the locks on the top. I marked the area near the clips where you have to push:

DS415+ RAM Upgrade 010

Like I said, this is the most difficult part. You may need to watch this video to get an idea of how exactly to do this step:

All the next steps are easy. Completely removing the cover:

DS415+ RAM Upgrade 011

Removing the 4 screws holding the metal drive frame (two on each side):

DS415+ RAM Upgrade 014

Remove the plastic part on top of the metal cage which is connected to one of the fans:

DS415+ RAM Upgrade 015

Unscrew the plate with the SATA connectors and remove it from its slot:

DS415+ RAM Upgrade 017

Then remove the metal cage completely:

DS415+ RAM Upgrade 018

Remove the plate with the ports:

DS415+ RAM Upgrade 019

After this, unscrew the 2 screws on the metal cover and you finally get access to the main board and the memory slot. Here it is, with dust and the pre-installed 2 GB memory module:

DS415+ RAM Upgrade 020

Old and new module side-by-side:

DS415+ RAM Upgrade 021

Main board with new module installed:

DS415+ RAM Upgrade 025

Next we repeat the steps in reverse order, and here is the DS415+ box assembled again:

DS415+ RAM Upgrade 026

And once it is powered up we can see the final result of this process:

After - System Information

Now I have 8 GB of memory in my NAS – have you noticed that my blog works faster now? 😉 The whole process took about 1.5 hours from switching the box off to powering it on with the replaced memory module.


Removing K2 for SharePoint app

It has been a while since my last K2-related post, but not because there is nothing to write about – it is just a bit difficult to allocate a time slot for writing. Honestly, with the K2 set of technologies you not only have the "marketing" promise of BYOA, you really can create your own K2-based app fast and with a very gentle learning curve. But when your work is to help various people at different points of their "gentle learning curve" with K2, the platform can take you on a ride up quite a steep learning curve :) I mean there is a ton of stuff to learn, with an array of use cases, design options and integration capabilities – a sort of "fasten your seat belts, we're going to move quickly" type of scenario :)

One of the basic things which seems to be a constant cause of confusion and support cases is the correct uninstallation of the K2 for SharePoint app from SharePoint 2013. Don't get me wrong, installation is important too, but once you have installed the app and started creating artifacts, there are extra things to care about when you need to uninstall the app for one reason or another. Let me elaborate on this and some related points in the following paragraphs.

First things first. SharePoint 2013 is different in terms of app development options, so K2 integration with SharePoint 2013 also differs from what you have for SharePoint 2010. The important thing is that you now deploy the K2 for SharePoint app into your SharePoint 2013 site, and all management of K2 artifacts happens within the SharePoint interface by means of the pervasive K2 Application button readily available on the ribbon:

Pervasive Application Button

This is your primary way of creating and deleting K2 artifacts in SharePoint 2013. The process of creating K2 artifacts in SharePoint 2013 is called "appifying": you use the K2 application against SharePoint items to appify them. So here you have 3 key terms:

K2 artifacts – SharePoint 2013-based K2 SmartObjects. Appify (verb) – create a SharePoint 2013 SmartObject from a SharePoint item (list, library, etc.). And there is an antonym: whatever you have appified can be de-appified.

So once you have selected a SharePoint 2013 item for which you want to create a K2 SmartObject to use in a K2 workflow, you click the Application button and initiate the appification process, which looks approximately like this:

Appify1

Appify2

You may wonder why I dwell on such trivial things as this new terminology. Because I want you to be very clear on this specific point: the appification process does lead to the creation of a SmartObject which you will be able to see in the Tester Tool, but you SHOULD NEVER manage or delete SharePoint 2013 SmartObjects using the Tester Tool, unless you are in the mood for a complex support case :) Once again: deleting/editing SharePoint 2013 service instances in the Tester Tool is not supported and will furnish you with troubles you don't want to have.

I hope that passage on terminology helps you memorize this. Now to antonyms :) – meaning, why de-appifying is important. As use of the Tester Tool is not supported for managing the SharePoint 2013 service instance, you have to de-appify SharePoint items which have been exposed to K2 (read: appified) before deleting those SharePoint items. If you fail to do this, you will end up with such unwelcome guests as orphan SharePoint 2013 SmartObjects, which you don't want to have in your environment. Deleting a SharePoint item which has been appified? De-appify it first! It is simple – click that pervasive K2 Application button on the ribbon of the object you want to de-appify and delete the created K2 artifacts:

Deappifying Single Item

Now to the topic of uninstalling the K2 for SharePoint app from your SharePoint 2013 site. It should be clear by this point that it involves de-appifying all appified items first. And knowing the way we treat documentation, I will start with what not to do. Do not do this:

Wrong First Step

Now that you are clear on what not to do, I can allow myself to add some details and explain why. Never use this Remove as the first step of the K2 app uninstallation process unless you are absolutely sure that no K2 artifacts have been created for this site (nothing was appified); otherwise you will end up with orphan SmartObjects. Or, to keep things simple, never do this first – instead learn the correct process of removing the K2 for SharePoint app, which removes the K2 artifacts (the right order of steps is crucial here):

Step 1. Removing K2 artifacts from the SharePoint site. You can do this by means of the Uninstall link under the General heading on the K2 for SharePoint Settings page. To access the K2 for SharePoint Settings page, you can hover your mouse over the K2 for SharePoint app icon and click on the ellipsis which appears in the top right corner of its tile; this brings up a pop-out menu from which you click SETTINGS (so 2 clicks are involved here):

Accessing K2 for SharePoint Settings 1

The one-click method of accessing the K2 for SharePoint Settings page is to click on the K2 app logo inside the "black square":

Accessing K2 for SharePoint Settings 2

From the K2 for SharePoint Settings page you have to click on the Uninstall link:

Uninstall

And as you can see from the warning you get, this process will remove all K2 artifacts from the site:

Step 1 Uninstall

Step 2. Uninstalling the K2 for SharePoint components from the K2 environment. To accomplish this, run the K2 for SharePoint Setup Manager from the Start menu and select Remove K2 for SharePoint.

The reason I describe this process in such detail is that the Remove button is, unhappily, right there in the UI and it is very tempting to click it :) The problem is that it won't remove K2 artifacts, and if you subsequently remove the appified SharePoint items (there will be no way of de-appifying them anymore), you will end up with orphan SmartObjects in your environment.

And of course it is documented nicely by K2 – Uninstall is described in the Maintenance section of the K2 for SharePoint Installation and Configuration Guide. But useful things do not always stand out as well as some dangerous UI buttons do. So I hope these explanations help at least someone, or bring a couple of small points to your attention, before you run into an issue or log a support case because you did not uninstall your K2 for SharePoint app properly, or because you simply deleted appified SharePoint items without de-appifying them first.

Of course there are more in-depth things to learn about the integration between K2 and SharePoint. For example, you may start with KB001707 K2 for SharePoint Component Compatibility – and that is only the beginning if you want to dive into the technical details – but as usual it is worth investing your time there before you invest heavily in building a solution without doing your homework on compatibility and supportability.


How to verify AD DS FFL/DFL and how to roll back to lower levels

Microsoft Active Directory Domain Services evolves with each edition of Windows Server, and whenever you do an initial install of AD DS or upgrade your DC servers there is an option/decision to set or raise the Domain and Forest functional levels (I will refer to those as DFL and FFL respectively below). Each new level introduces some new features, and starting from 2008 R2 there is an option to roll back to lower levels. This is possible only if you have not enabled certain features which require the current FFL/DFL (think of the AD Recycle Bin, etc.).

In short, from a Server 2012 R2 DFL/FFL you can roll back as far down as Server 2008 (certain limitations apply here). In the TechNet article "Understanding Active Directory Domain Services (AD DS) Functional Levels" you can find details on the features available at each functional level as well as a neat table describing the possible rollback options. In this blog post I want to describe how you can check your current FFL/DFL and perform a rollback to a lower level.

Most of the features are tied to the DFL, and some of them also have FFL requirements. To give you an idea of which features become available with newer DFLs:

Server 2008: DFS replication support for WS 2003 SYSVOL, Domain-based DFS namespaces (includes support for access-based enumeration and increased scalability), AES128/256 for Kerberos, Last Interactive Logon Information, Fine-grained password policies, Personal Virtual Desktops.

Server 2008 R2: Authentication mechanism assurance (packages info about the logon method type into the Kerberos token), automatic SPN management for services running on a specific computer under the context of a Managed Service Account when the name of the machine account changes.

Server 2008 R2 FFL: Active Directory recycle bin

Server 2012: KDC support for claims, compound authentication and Kerberos armoring

Server 2012 R2: DC-side protections for Protected Users, Authentication Policies, Authentication Policy Silos.

Now to the practical part of this blog post. How to check current FFL/DFL:

1) GUI way. Open the "Active Directory Domains and Trusts" snap-in and right-click your domain to access its properties. On the General tab you will be able to see the DFL/FFL:

AD DS DFL-FFL level 01

2) PowerShell method 1. This method does not require admin or domain admin rights and can be used even as a limited user on a domain-joined workstation with PowerShell v2/v3 or newer. It also does not require any third-party tools or PowerShell modules (e.g. the Microsoft PowerShell AD module).
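A minimal sketch of such a script, querying RootDSE via plain ADSI (no modules or admin rights needed):

```powershell
# Read the functional levels straight from RootDSE via ADSI.
# Works as a regular domain user on a domain-joined machine.
$rootDse = [ADSI]"LDAP://RootDSE"

$rootDse.domainFunctionality             # DFL
$rootDse.forestFunctionality             # FFL
$rootDse.domainControllerFunctionality   # functional level of the answering DC

# The numeric values map to functional levels as follows:
# 0 = Windows 2000, 2 = Windows Server 2003, 3 = Windows Server 2008,
# 4 = Windows Server 2008 R2, 5 = Windows Server 2012, 6 = Windows Server 2012 R2
```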

This script returns 3 numeric values; the comments in the sketch above show how they map to functional levels (e.g. 6 corresponds to Windows Server 2012 R2).

3) PowerShell method 2. Requires Microsoft PowerShell AD Module.
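A minimal sketch using the ActiveDirectory module cmdlets Get-ADDomain and Get-ADForest:

```powershell
# Requires the ActiveDirectory module (part of RSAT, or run on a DC)
Import-Module ActiveDirectory

(Get-ADDomain).DomainMode   # e.g. Windows2012R2Domain
(Get-ADForest).ForestMode   # e.g. Windows2012R2Forest
```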

This method gives easily readable output instead of the numbers you get with method (2).

How to roll back to a lower FFL/DFL:

While you can raise the FFL/DFL via the GUI using the "Active Directory Domains and Trusts" snap-in, you cannot lower it that way and have to use PowerShell. Here are the commands you have to use:
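A sketch of the commands, using the ActiveDirectory module (the domain name is a placeholder; adjust the target mode to the level you need):

```powershell
Import-Module ActiveDirectory

# Lower the forest functional level first, then the domain functional level.
# "contoso.com" is a placeholder for your forest/domain name.
Set-ADForestMode -Identity "contoso.com" -ForestMode Windows2008R2Forest
Set-ADDomainMode -Identity "contoso.com" -DomainMode Windows2008R2Domain
```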

The order is important: first the forest, then the domain. Depending on the level you are rolling back to, values for the ForestMode/DomainMode parameters include Windows2012Forest/Windows2012Domain, Windows2008R2Forest/Windows2008R2Domain and Windows2008Forest/Windows2008Domain.

As the ability to lower the FFL/DFL was introduced in Server 2008 R2, you cannot go lower than Server 2008. So if for your tests you need something like Server 2003 or Windows 2000 in terms of FFL/DFL, you have to install AD DS from scratch.

End of support for old IE versions

Older versions of IE are no longer supported by MSFT starting from January 12, 2016

Microsoft announced that it ends support for old versions of IE on January 12, 2016, which means that:

– The only Microsoft-supported browser from this date on is IE11, and only it will continue to receive security updates, compatibility fixes, and technical support on Windows 7, Windows 8.1, and Windows 10.

– IE 8/9/10 are no longer supported, i.e. they will still work, but given that no security patches or other fixes will be provided, using them in the enterprise (or, IMO, at home too) doesn't seem to be a good idea.

Microsoft communicated about a year ago that this change would take place, but as usual some companies will be unprepared for it, as was the case with the end of support for Windows XP (the US Navy paid $9M to MSFT and still(!) continues to receive support for XP). See the related article on techrepublic.com: Internet Explorer: How Microsoft scaling back support is leaving big orgs playing catchup.

The only problem here is that some older versions of Windows can't upgrade to this browser, but those versions of Windows have themselves reached end of support. Though there are some intrepid enterprises not going to do away with XP, I think it is inevitable, as it is simply no longer cost effective, however strongly you want to avoid paying for the upgrade or the "pain" of migration.

I guess MSFT is trying to focus more on the quality and speed of their browser, as the backward compatibility burden goes largely under-appreciated by the general public, which tends to criticize MSFT browser performance severely without realizing that it is an iceberg: its largest part, heavily shaped by backward compatibility, is hidden under water and affects its agility. So MSFT is doing a great job for their enterprise customers, but in the end everybody displeased with performance compares it against browsers with no backward-compatibility burden whatsoever, which roll out updates in high-frequency DevOps fashion.

In case you are using K2 smartforms, this also affects you, as the K2 compatibility matrix is going to reflect it. It doesn't mean that K2 smartforms will suddenly stop working in old browsers, but it does mean that K2 will also stop releasing fixes or patches for these older versions of Internet Explorer. I.e. K2 support will still assist you with troubleshooting issues you may face on older versions of IE and help you find possible workarounds, but no fixes or patches will be built for newly found bugs – in such cases you will be required to upgrade to IE11.

These changes on the K2 side are driven by the Microsoft support policy change, as K2 works on top of Microsoft technologies and tends to focus on quality and on building components for supported/current versions of the Microsoft technology stack.

Please refer to the official K2 Technical Bulletin communicating this change. You can see that the compatibility matrix for K2 smartforms has also been updated, and both the Design and Runtime Browser sections get new footnotes for IE 8/9/10:

End of support for old IE versions

P.S. We may also see some backlash and corrections from the MSFT side. It seems there was something similar with decisions like EoL for InfoPath or releasing SharePoint as a cloud-only product, which were reconsidered… Though with IE I think it is more justified for MSFT to stick to this decision.


K2 blackpearl Installation: Configuring MSDTC properties

Disclaimer: if you have no difficulty saying how to access the Local DTC properties on a Windows machine without googling or trying hard to remember, then you don't need to read this post. :)

I was building a retro test environment with K2 4.6.2 installed and ran into the following warning raised by K2 Setup Manager – "MSDTC Network Access options not set correctly":

MSDTC Network Access 1 Warning

K2 Setup Manager quite explicitly tells you what you have to do, down to the exact check boxes you need to have checked. Unfortunately, 4.6.2 can't repair this without your intervention (I assume recent versions can't either, but I need to double-check this). Anyhow, it is not difficult to guess that the correct settings are supposed to look like this:

MSDTC Network Access 2 Correct Settings

Once the required check boxes are in place, the warning is resolved – just click "Analyze" after you have made these changes:

MSDTC Network Access 3 Warning Cleared
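If you prefer to script this instead of clicking through the GUI, the same check boxes correspond to registry values under the MSDTC Security key – a rough sketch (double-check the exact set of options against the Setup Manager warning for your version):

```powershell
# MSDTC network access options live under this key; restart the MSDTC service after changing them
$key = 'HKLM:\SOFTWARE\Microsoft\MSDTC\Security'
Set-ItemProperty -Path $key -Name NetworkDtcAccess             -Value 1
Set-ItemProperty -Path $key -Name NetworkDtcAccessTransactions -Value 1
Set-ItemProperty -Path $key -Name NetworkDtcAccessInbound      -Value 1
Set-ItemProperty -Path $key -Name NetworkDtcAccessOutbound     -Value 1
Restart-Service MSDTC
```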

Looks straightforward enough, but the question/issue here may be how to access these "Local DTC properties" (maybe there is an answer in the help hyperlink mentioned by Setup Manager in the same warning?).

Anyhow, without looking it up I keep forgetting how to access this dialog, so I decided to jot it down. You have to run the Component Services MMC snap-in:

MSDTC Network Access 4 Component Services MSC

In this MMC snap-in you just locate "Local DTC" and select Properties from its context menu. To invoke the snap-in itself, you can run either of the commands below.
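Either of the following does the trick (both can be typed in the Run dialog, a command prompt, or PowerShell):

```powershell
comexp.msc   # the Component Services MMC snap-in itself
dcomcnfg     # the classic shortcut, opens the same console
```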

In case the location matters to you, both live under %SystemRoot%\System32.


Checking for Windows update presence via command line

Just a quick note on how to check whether a specific Windows update is installed on a system via the command line. I thought I had already taken a note of this, but I was not able to find an existing post about it. While investigating an issue or troubleshooting something, more often than not you need to confirm whether a specific update is present on the system or not, and you don't always want to wade through the GUI to check. You may also want to automate this process or do some mass discovery by running a script on multiple machines.

Option 1. Query the WMI (Windows Management Instrumentation) namespace.

1-A. Using wmic.exe (a powerful, user-friendly CLI to the WMI namespace):
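For example, run from a command prompt, to list installed updates or look for one specific KB (the KB number below is just an example):

```
wmic qfe get HotFixID,InstalledOn
wmic qfe get HotFixID,InstalledOn | find "KB2999226"
```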

1-B. Using PowerShell:
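A minimal sketch querying the same WMI class from PowerShell (again, the KB number is just an example):

```powershell
Get-WmiObject -Class Win32_QuickFixEngineering |
    Where-Object { $_.HotFixID -eq 'KB2999226' }
```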

Option 2. Using the PowerShell cmdlet Get-HotFix, which was introduced in PowerShell 2.0:
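For example (the KB number and computer names are placeholders):

```powershell
# Reports an error if the update is not installed
Get-HotFix -Id KB2999226

# The same check against several machines at once
Get-HotFix -Id KB2999226 -ComputerName SERVER01, SERVER02 -ErrorAction SilentlyContinue
```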

So this should be enough for most of the cases.

NAS ALB tests result

Synology DS415+ Adaptive Load Balancing

UPDATE: I had a discussion with Ulrik D. Hansen, the developer of NAS Performance Tester, and thanks to his input realized that the test results described in this blog post are not correct. I will leave this post as is and write another one soon with new tests and some explanations.

It has been something like a year since I started using the Synology DS415+ NAS. This particular model has 2 Gigabit Ethernet ports with support for network load balancing and fault tolerance options, i.e. it allows you to do link aggregation. This feature was one of the important factors when I selected this model. Unfortunately, when I tried it I realized that I could not do load balancing, as it requires 802.3ad LACP support from the network switch. After quick research I found out that SOHO WiFi access points don't have advanced switching capabilities and are normally limited to 4 Ethernet ports, unless you are ready to try some unsupported tweaks with alternative firmware. Unfortunately, this is the case even for the newest premium-segment models, including the one I'm aiming to buy soon (ASUS RT-AC5300). The thing is that vendors normally try to max out the device's wireless capabilities while keeping the built-in switching capabilities simple – and I can understand it, as a wired connection is always way faster than the fastest possible WiFi and is by no means the primary feature of a WiFi access point. Still, buying a more advanced switch makes sense anyway, but I will cover that later.

So because it required buying a new switch, I abandoned the idea of using link aggregation with my Synology DS415+ as something I couldn't do with the equipment I had… And to be honest, I somewhat missed the fact that in the DSM 5.2 update Synology added support for two new modes when creating a bond interface: Adaptive Load Balancing and Balance XOR (see the Synology DSM release notes). Only recently did I test this, and it indeed works: you just create a network bond and enable the Adaptive Load Balancing option – nothing is required on the switch side:

NAS Enable Adaptive Load Balancing

Once it is enabled, your network bond properties will drastically improve your mood if you are a person who cares about performance ratings and large figures… :) It's a shame that we no longer have the performance index in recent Windows releases (starting from Windows 8) and that 3DMark tests have fallen out of fashion – you may argue about the relevance of indexes/numbers and synthetic tests, but you can't argue that it was fun to have these things :) It's sad to see us moving away from universal benchmarking software, but it seems that the state of the hardware ecosystem makes many of these tests irrelevant.

NAS ALB Bond properties

I looked around the net for feedback on whether enabling this gives you a real advantage or not and found a lot of pessimistic stuff along the lines of "you are never going to leverage this because of XYZ" or "XYZ will be the bottleneck", etc. So people are really confused about whether there are real gains from using this. The common-sense answer is: it all depends (on the specific workload/scenario you are interested in). But whenever the subject of performance is raised, the baseline is everything – luckily I did some basic performance tests without Adaptive Load Balancing (NAS connected over a single Gigabit Ethernet port) some time ago, so now I can do at least some comparison. Of course this is not a super clean comparison: I have a newer DSM version on the NAS (which may contain some improvements) and my target volume is formatted as ReFS – so those are differences. But the improvement is still way beyond any test inaccuracy, as you will see below.

Test 1. NAS performance tester using 400 MB file with both header and data digests disabled.

Single gigabit Ethernet link:

Two gigabit Ethernet links with Adaptive Load Balancing enabled:

NAS Test 400MB

Test 2. NAS performance tester using 8000 MB file with both header and data digests disabled.

Single gigabit Ethernet link:

Two gigabit Ethernet links with Adaptive Load Balancing enabled:

NAS Test 8000MB

So to sum this up, we see a real performance increase here:

NAS ALB tests result

So despite the fact that the laptop from which I ran this test is connected over only a single Gigabit LAN port, we see substantial performance gains. But what if, instead of the plain simple switch built into my WiFi access point, we add a big managed switch? I'm expecting to get one next week with a switching capacity of 36 Gbps (26.79 Mpps max forwarding rate) and do some more tests – the same Adaptive Load Balancing config while connected through this dedicated switch, and later another test of the 802.3ad link aggregation mode (the switch I'm going to get is the D-Link DGS-1100-18 from the DGS-1100 series of Gigabit Smart Switches). Stay tuned if you are interested in seeing the test results.


Switching SP2010 from Classic Mode to Claims Mode Authentication

SharePoint Server 2013 uses claims-based authentication as its default authentication model, and it is required to enable its advanced functionality. Using claims-based authentication has the following advantages over using Windows classic-mode authentication:

  • External SharePoint apps support. App authentication and server-to-server authentication rely on claims-based authentication. With Windows classic-mode authentication you are unable to use external SharePoint apps. You also cannot use any services that rely on a trust relationship between SharePoint and other server platforms, such as Office Web Apps Server 2013, Exchange Server 2013, and Lync Server 2013.
  • Claims delegation without “double-hop” limitation. SharePoint can delegate claims identities to back-end services, regardless of the sign-in method. E.g., suppose your users are authenticated by NTLM authentication. NTLM has a well-known “double-hop” limitation, which means that a service such as SharePoint cannot impersonate the user to access other resources on behalf of the user, such as SQL Server databases or web services. When you use claims-mode authentication, SharePoint can use the claims-based identity token to access resources on behalf of the user.
  • Multiple authentication providers per web application. When you create a web application in claims-based authentication mode, you can associate multiple authentication providers with the web application. This means that, for example, you can support Windows-based sign-in and forms-based sign-in without creating additional IIS websites and extending your web application to additional zones.
  • Open standards. Claims-based authentication is based on open web standards and is supported by a broad range of platforms and services.

There are several supported scenarios for migrating or converting from classic-mode to claims-mode authentication, which are performed with a number of Windows PowerShell cmdlets: you either switch your web apps on SP2010 before the upgrade to SP2013, or you convert SharePoint Server 2010 classic-mode web applications to SharePoint Server 2013 claims-mode web applications after you already have SP2013 installed.

Steps to switch your SP2010 web apps to claims-based authentication (a PowerShell sketch covering all four steps follows the list):

1. Enable claims authentication for your web app.

2. Configure the policy to provide the user with full access.

3. Perform the migration.

4. Provision claims.
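A sketch of all four steps based on the standard classic-to-claims conversion procedure (the web application URL and the account are placeholders; run this in the SharePoint 2010 Management Shell):

```powershell
$webAppUrl = "http://sharepoint.contoso.com"   # placeholder
$account   = "CONTOSO\spadmin"                 # placeholder

# 1. Enable claims authentication for the web application
$wa = Get-SPWebApplication $webAppUrl
$wa.UseClaimsAuthentication = $true
$wa.Update()

# 2. Configure a policy granting the claims-encoded account full access
$claim  = (New-SPClaimsPrincipal -Identity $account -IdentityType WindowsSamAccountName).ToEncodedString()
$policy = $wa.ZonePolicies("Default").Add($claim, "Full access policy")
$policy.PolicyRoleBindings.Add($wa.PolicyRoles.GetSpecialRole("FullControl"))
$wa.Update()

# 3. Perform the migration of existing users to claims identities
$wa.MigrateUsers($true)

# 4. Provision claims across the farm
$wa.ProvisionGlobally()
```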

Once you are done with these changes, you can verify that you are using claims authentication for your web application:

GUI way. In Central Administration, navigate to web application management, select your web application and click the Authentication Providers button:

SP 2010 check web app authentication mode 01

It will open a window where you can verify your default authentication mode:

SP 2010 check web app authentication mode 02

 PowerShell way:
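Checking the UseClaimsAuthentication property of the web application is enough (the URL is a placeholder):

```powershell
(Get-SPWebApplication "http://sharepoint.contoso.com").UseClaimsAuthentication
```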

It will return True or False depending on whether you have Claims Authentication enabled or not (screenshot below for enabled state):

SP 2010 check web app authentication mode 03

In case you have K2 components installed, you may need to perform the relevant configuration changes on the K2 side (see the Claims Authentication Configuration section at help.k2.com), which I will cover in a separate blog post.

In case you are in the mood for a deep dive into the what and why of claims authentication, you may read through the following articles:

Identity (Management) Crisis (Part 1): The evolution of identity concepts

Identity (Management) Crisis (Part 2): Everything you (think you) know is wrong

Identity (Management) Crisis (Part 3): Solving the Identity Problem

Identity (Management) Crisis (Part 4): Selecting a Comprehensive Identity Management solution

Claims Based Identity: What does it Mean to You? (Part 1)

Claims Based Identity: What does it Mean to You? (Part 2)

Claims Based Identity: What does it Mean to You? (Part 3)


Category specific RSS feeds added

I’ve just installed a WordPress plugin which enables category-specific RSS feeds on this blog. It means that if you are interested in new posts here, but only on a specific topic, you can use a category-specific RSS feed as opposed to the general feed which contains all sorts of posts on various topics. For example, you may want to use the K2, Tech or Language Learning category feeds if you are interested in specific topics only (which is most likely the case). I also placed the K2 category RSS feed link on the sidebar for convenience.


Preparing to revisit 70-689 exam again

It has been a while since I wrote my previous blog post in general, and quite a while since I wrote any blog post related to the MSFT 70-689 exam, so now it is high time to do so. :)

I spent last week attending SharePoint 2013 training (to be more specific, it was course 20332, Advanced Solutions of Microsoft® SharePoint® Server 2013), and you may be surprised that I still bother about Windows 8.1 instead of working towards SharePoint 2013 certifications. Well, SharePoint certifications as well as some related blog posts are on my to-do list, no worries. To be honest, I don't have that sort of neat to-do list, but rather loads of scattered notes and vague intentions and ideas, and I am working towards the points and targets floating in this fluid list :) On that list, apart from the Windows 8.1 exam, there are some K2- and SharePoint-related blog posts, a re-take of the K2 certifications, and even more stuff…

Anyhow, I failed the 70-689 exam back in the spring (twice, actually – in April and June) and publicly announced that I would make it sooner or later, so now I'm announcing it again :) I recently finished listening to Rewire by Richard O'Connor, and according to that qualified professional's opinion, making public announcements about your targets makes you stick to them better, as it makes failure conspicuous and a bit more painful, so I'm going to use this technique here… In case anybody would interpret this behavior as some sort of compulsive fixation, I would say no: I have really made good progress (though I am still working on this skill) at giving up projects and targets which I have reconsidered as irrelevant to me, either entirely or at this point. If you want an example: I used to be a PhD Economics student, but dropped it midway after passing some exams and writing some early drafts of that big work you are supposed to write as a PhD student, and I'm perfectly OK with the fact that I removed this target from my to-do list (it also doesn't prevent me from enjoying good economics books, and I never regret investing my resources in that attempt).

Anyhow, I finally started reading "MCSA Microsoft Windows 8.1 Complete Study Guide: Exams 70-687, 70-688, and 70-689", which I had only skimmed before attempting the 70-689 exam back in the spring, and I am going to schedule my exam as soon as I am done reading and consistently getting 100% on the MeasureUp practice test. And for those who can't imagine a blog post without pictures:

Ready to revisit 70-689

Revisiting the assessment test from the beginning of the aforementioned book – 30 out of 40, not the best result, but… As you can see, at times it is possible to do your learning at a coffee shop with the help of napkins :) If you look at the picture really carefully, you may spot the funny answer B to question 33:

Q: Your computer’s application has stopped responding. What should you do?

One of the potential answers: “Start backing up like crazy”

I really appreciate the authors trying to inject some fun into dry material, but I am a bit concerned about people selecting this as the right answer :)

P.S. For those who think this exam is too easy for experienced IT pros, I would suggest not using dumps and also reading through the torrents of laments from experienced folks in the comments below this blog post – in short, people really find that recent Microsoft exams have too much out-of-scope material injected, are excessively broad for the primary technology/topic, require you to memorize a ton of trivia and know the right answer as prescribed by MSFT, and in general "it seems that somebody at the top at MSFT said let's make exams harder" :) But really, it's not a problem for those who really want to pass this exam, right? 😉
