I recently had a support case that led me to a rather neat way of checking for big files in a specific directory, which I will describe below.
Under certain conditions you may see the following issue in K2: very high CPU usage and, by extension, overall sluggishness of K2 applications, accompanied by “System.IO.IOException : The requested operation could not be completed due to a file system limitation.”
As in most cases, the error message itself indicates what is wrong: “The requested operation could not be completed due to a file system limitation” should ring a bell that some file or files have run amok and grown beyond file system limits, or something along these lines. If you read your logs even more closely, they may even give away the specific culprit by indicating the name of the log file responsible for this.
K2 has broad logging capabilities for monitoring and troubleshooting purposes (a quite good overview of K2 logging can be found here), but in terms of logging volume the main suspects are: SmO logging (the only logging which cannot be capped in terms of file size), ADUM logs (very voluminous, especially at debug logging level; file size can be limited through configurable settings, meaning you have to go the extra mile to end up with an unhealthily big file) and lastly debug assemblies you may receive from K2 support. Debug assemblies are usually quickly built ad-hoc troubleshooting tools meant to investigate a specific issue, and they may well have no log file size limit while writing super detailed (read: voluminous) logging. As such, they are supposed to be removed once your troubleshooting effort is complete, but in reality they can be left applied for a while, which gradually evolves into forever…
Anyhow, the exception “System.IO.IOException : The requested operation could not be completed due to a file system limitation.” in the K2 host server log is in most cases caused by an abnormally large log file, one that becomes so big it exceeds the RAM size, making it difficult to open and append to for writing. From there you slide into degraded performance, the high CPU moment, and the “aha, I forgot to disable/remove unneeded logging” moment.
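To illustrate that point, here is a minimal sketch (the installation path is only an example, and the largest-file-versus-RAM comparison is my own rough heuristic, not an official K2 diagnostic) that compares the single largest file under the K2 folder with the installed RAM:

# Find the single largest file under the K2 folder (path is an example - adjust to your installation)
$largest = Get-ChildItem -Path 'C:\Program Files (x86)\K2 blackpearl' -Recurse -Force -File |
    Sort-Object -Property Length -Descending |
    Select-Object -First 1
# Total physical memory of the machine
$ramBytes = (Get-CimInstance -ClassName Win32_ComputerSystem).TotalPhysicalMemory
# Print both sizes in GB for a quick comparison
"{0}: {1:N2} GB (installed RAM: {2:N2} GB)" -f $largest.FullName, ($largest.Length / 1GB), ($ramBytes / 1GB)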
Now to my takeaway from this case (though what is said above is also worth noting): how to quickly check for huge files in a specific directory. Just use this PS script:
Get-ChildItem -Path 'C:\Program Files (x86)\K2 blackpearl' -Recurse -Force -File |
    Select-Object -Property FullName,@{Name='SizeGB';Expression={$_.Length / 1GB}},@{Name='SizeMB';Expression={$_.Length / 1MB}} |
    Sort-Object -Property SizeGB -Descending |
    Out-GridView
You may also limit the output to the ten largest files by adding an extra “Select-Object -First 10” step after Sort-Object, as shown below (note that adding -First 10 to the existing Select-Object would cut the list before sorting, so you would not necessarily get the largest files). This is especially useful when you are primarily interested in identifying the largest file or files.
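For reference, here is one way the full pipeline could look with that extra step in place (same path and calculated properties as in the script above):

Get-ChildItem -Path 'C:\Program Files (x86)\K2 blackpearl' -Recurse -Force -File |
    Select-Object -Property FullName,@{Name='SizeGB';Expression={$_.Length / 1GB}},@{Name='SizeMB';Expression={$_.Length / 1MB}} |
    Sort-Object -Property SizeGB -Descending |
    Select-Object -First 10 |
    Out-GridView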
Here is how the result looks for a healthy K2 folder (by healthy I mean one without strangely big log files):
As you can see, normally you should not have anything 1 gigabyte or more in size, while the above-mentioned exception is usually caused by a 10-20 GB log file, which will feature prominently at the top of the output.
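If you prefer a quick console check instead of the grid view, a minimal sketch along the same lines (the 1 GB threshold and the path are just examples) that lists only files of 1 GB or more would be:

Get-ChildItem -Path 'C:\Program Files (x86)\K2 blackpearl' -Recurse -Force -File |
    Where-Object { $_.Length -ge 1GB } |
    Sort-Object -Property Length -Descending |
    Format-Table -Property FullName, @{Name='SizeGB';Expression={[math]::Round($_.Length / 1GB, 2)}} -AutoSize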
See also related K2 community KB: Exception – The requested operation could not be completed due to a file system limitation.