Sunday, March 22, 2009

Return from functions in PowerShell

I wrote two blog posts about returning values from functions (here and here) in the past; however, as I just noticed, I never posted any solutions there.

There are in fact at least 3 solutions to this problem.

1.) The first was already mentioned – don’t use the function as a function, but as a subroutine, and assign the value back to a variable provided as an argument.
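A minimal sketch of this approach, using a [ref] parameter (the function and variable names here are just illustrative):

function Set-Result ([ref]$Target) {
    $Target.Value = 42        # assign the result into the caller's variable
}

$MyVar = 0
Set-Result ([ref]$MyVar)      # call it like a subroutine, ignore the pipeline
$MyVar                        # 42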

2.) The second approach is that you can always treat the function as returning an array and only take the last element:

$MyVar = [array]$(Function) | Select -Last 1

That will cast the output from the function to an array (even if it has only a single member) and then return the last element of that array.
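A quick sketch of this in action (Get-Answer is just an illustrative name):

function Get-Answer {
    "some unwanted status output"    # accidental pipeline pollution
    42                               # the value we actually care about
}

$MyVar = [array]$(Get-Answer) | Select -Last 1
$MyVar    # 42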

3.) Another solution, provided by Oisin, is to redirect a whole script block to null:

“If you have no clue about/control over whether things may or may not emit and/or are just a bit lazy, dot source a script block redirected to $null. Dot sourcing it ensures it is evaluated in root function scope, and also throws all output away.”

function Test {

    . {
        1, 2, 3            # anything emitted here is thrown away
    } > $null

    return 4,5,6           # only this reaches the caller
}
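Calling it confirms that everything emitted inside the dot-sourced block is discarded:

Test    # returns only 4, 5, 6 – the 1, 2, 3 never reaches the caller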

I like Oisin’s approach the most – it is not hard to implement and works really nicely.

 

One warning though – you shouldn’t take this as a general solution, and it is not recommended. This approach is mostly useful when you execute functions from an external source (like invoking them from an XML file) or when you have very complex functions with tons of unexpected returns (especially when handling XML files).

Wednesday, March 18, 2009

Final naming convention

This is a follow-up to my previous post about naming conventions in PowerShell and especially the need for namespaces.

Oisin shared some details about modules for PowerShell v2 – and these are the changes I was looking for.

When you load a function from a module in Posh V2, you can access it through the module. Consider an example where my module is called Citrix and contains a single function, Get-Server.

If there is only a single Get-Server function, I can use it directly; however, if multiple modules contain a function with the same name, I must use the syntax <Module>\<Verb>-<Noun>.

The example above therefore translates to Citrix\Get-Server.

This is a naming convention I could use – all I need to do now is simply change the delimiter from “:” to “\” and that’s it – I am ready for Posh V2 ;)

The problem is that I have scripts that act as containers for functions – similar to how Posh V2 modules work. Fixing these modules and their functions is simple; however, it will get much more complicated once I try to fix the various scripts that are using these functions.

For that, I will use the following workaround in Posh V1: each function name will be <Module>\<Verb>-<Noun>. So, for example, my function declaration will look like this:

Function Global:Citrix\Get-Server {…}

That way, once Posh V2 is released and I decide that I want to use modules, I only need to change it to Function Get-Server {…} and I will get the same user experience as with V1. Simple and not confusing for end users. Functions will have the same names in V1 and V2 – and therefore I am prepared for a very easy migration to V2 in the future.
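For comparison, the Posh V2 side of the migration is just a plain function inside a module – a sketch, assuming a module file called Citrix.psm1 placed in the module path (the function body is only a placeholder):

# Citrix.psm1 – no prefix needed in the function name itself
Function Get-Server { "MyCitrixServer01" }

Import-Module Citrix
Citrix\Get-Server    # module-qualified call – the same syntax users already know from V1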

Tuesday, March 17, 2009

PowerShell naming convention

Recently I posted about the “smart” function names I am using – and we had a great discussion about it :) I promised I would write a follow-up afterwards, and here it comes :)

First of all, I would like to say that I BREAK the PowerShell naming convention and I am fully aware of it; however, in my situation the positives outweigh the negatives. Because the number of people not following the naming convention is growing, I think there is a chance that some small gaps in the PS design lead to this situation.

First, let’s discuss what the naming convention for PowerShell functions is:

<Verb>-<Noun>

A good and simple example is New-Object. However, what if you need to be more specific? For example, what if you want to create a new object in SCCM?

Then the official way to do so would be New-SCCMObject.

According to the official naming convention, retrieving all SCCM-related functions can be achieved by calling Dir Function:*-SCCM*.

You can find the Microsoft document about the PowerShell naming convention here, and I highly recommend reading it.

You shouldn’t change the verb, only the noun.

 

At least based on my experience, newcomers to PowerShell have problems understanding this – most of the time when I saw someone’s first scripts, they almost always started with SCCM-* (or some other technology-based prefix). Maybe it’s caused by the fact that the flexibility of the noun is not stressed enough, but I think it is caused by a missing element of the naming convention – the ability to specify a script category or technology.

Most of the time when I heard complaints about any changes to the naming convention, the argument was that I am breaking the existing naming convention. That is true – however, it doesn’t say anything about the fact that maybe the current naming convention should be tweaked a little bit to be more flexible – at least because the popularity of PowerShell is growing, and especially with modules in Posh V2 the exchangeability of PowerShell scripts will improve even further.

My personal feeling is that once someone decides the existing naming convention doesn’t fit their requirements (for any reason), the field is completely open and they can do whatever they want. Most of the time the result is a technology-based prefix; however, sometimes we can see absurdities like the ones from Citrix ;)

Another common argument is that administrators will have problems accepting longer function names – but that’s why we have aliases ;)

Another reason why I don’t always like the current naming convention is automation – there is no delimiter between the script category and the script itself (the noun plus any extension is a single element). To give you an example – I am currently using the <NameSpace>:<Verb>-<Noun> syntax. If someone doesn’t like it and would like to switch to the default <Verb>-<Noun> syntax, they can use a simple function that will just produce <Verb>-<NameSpace><Noun> – for example, “translate” the function Citrix:Get-Farm to Get-CitrixFarm. This is possible because there is a delimiter between the namespace (I know it is NOT a namespace, but I like to use that name ;)) and the rest of the function name. It won’t work the other way around – there is no way to detect that “Citrix” in “CitrixFarm” is the category. Another example is Dmitry’s post called “We need namespaces!” – IF there were a clear delimiter between the namespace and the function name, you could easily write a “using” function yourself – either using the same example Dmitry provides or, even better, using a script block:

Use-NameSpace -NameSpace Quest.ActiveRoles.ADManagement -Process {
    Get-User "Dmitry Sotnikov" | Set-User -city "St. Petersburg"
}

You don’t need to extend PowerShell itself to support this – just write a few simple functions and apply a consistent naming convention.
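For example, here is a minimal sketch of such a “translate” helper (the function name is hypothetical, and it uses the -split operator from Posh V2):

# Translate "<NameSpace>:<Verb>-<Noun>" into the default "<Verb>-<NameSpace><Noun>" form
Function ConvertTo-DefaultName ([string]$Name) {
    $namespace, $rest = $Name -split ':', 2    # "Citrix" and "Get-Farm"
    $verb, $noun = $rest -split '-', 2         # "Get" and "Farm"
    '{0}-{1}{2}' -f $verb, $namespace, $noun   # "Get-CitrixFarm"
}

ConvertTo-DefaultName 'Citrix:Get-Farm'        # Get-CitrixFarm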

 

What I am missing is some industry standard – which delimiter should be used? Should it be “:” as I do, “\” as Dmitry suggests, or “::” as in Kirk’s example? I don’t really mind – as long as there is something that everyone will follow, I would be more than happy to change my scripts to match that standard.

 

There are a few reasons why I like to use category prefixes:

  • It’s easier to understand for both administrators and developers. For a developer, category prefix = namespace. For an administrator, category prefix = toolbox. So Citrix:Get-Farm can be read as “use the Get-Farm function from the Citrix: toolbox”. If any reader has used my naming convention, please share your experiences.
  • Such function names support further automation – see the examples above (translating to the regular Posh naming convention, implementing “using” functionality, etc.).
  • It is easier for vendors to introduce functions\cmdlets. A common example is the virtualization field – once the first vendor uses Get-VM as a cmdlet name, the rest of them need to work around it somehow. Releasing cmdlets\functions and sharing scripts would be much simpler if there were support for namespaces.
  • In my situation, I prefer decentralized development. Many people are writing scripts for a single framework and I don’t want to control what they are doing and how they name their functions – now all I need to do is assign a prefix to them (you work on SCCM, so you will use the SCCM: prefix; someone else works on Altiris, so he will get the Altiris: prefix). That way, management of bigger projects\frameworks is much easier.
  • Using consistent function names – I know it sounds strange, but for example I have a function that writes to the event log as part of my framework. What is the logical name? Write-EventLog. However, that is already reserved by PowerShell itself – so I just specify S4M as the prefix (that’s the name of my framework), and the function name becomes S4M:Write-EventLog.

 

Even Jeffrey agrees that this is important to solve somehow. What I am missing at this moment is some kind of naming convention reservation – I don’t mind how long it takes Microsoft to implement it, however I would like to know how it will look, so I could already implement the same naming scheme (before it is too late). There was a recent blog post about reservation of keywords – and that’s what I would like to get. Many years ago I started naming my cmd scripts using the PowerShell Verb-Noun syntax, just to get the people I worked with used to that logic. Now I would like to do the same – start implementing technologies that will come in the future and prepare people for them.

If you get my point, I highly recommend you vote for the Connect feature request “We need namespaces” from Dmitry Sotnikov.

UPDATE: Read the follow-up here

Monday, March 16, 2009

Cloud 3 – applications and tools

Here comes another part of my cloud series – this one is not very interesting, because there is nothing special in it; however, it is tightly related to the next one – keeping installation sources automatically updated.

I keep all my installation sources in Live Mesh, divided into three parts:

  • PrivateSources (like Office, Visio…) – this one is accessible only by me and is not shared with anyone
  • Sources – installation sources; here I store publicly available applications
  • Tools – applications that don’t require any installation

Live Mesh allows me to automatically synchronize applications between multiple computers – the major benefit is definitely Tools, because no installation is needed and I can use them immediately. I know I could run into “out of space” pretty soon (due to the Live Mesh 5GB limitation), however I don’t use that many applications, so for me it is perfect ;)

What is interesting about this is how I keep all sources automatically up to date and synchronized between multiple computers – but for that, you will need to wait for the next blog post ;)

Thursday, March 12, 2009

Smart function names in PowerShell

As you all know, the official syntax for PowerShell functions is <Verb>-<Noun>. For example, Write-Script.

However, this is not always the case – there are many functions that don’t follow this naming convention, and the reason is usually that the function name is already taken or the author wants to make it obvious which technology the function relates to.

You then end up with function names like SCCMImport-Server or SCCM-ImportServer.

I prefer to use “functionality scopes” (my term, sorry, couldn’t come up with anything better ;)).

Consider an example where I want to create a new object in Citrix. Logically I would like to call the function New-Object, however that was already taken by the PowerShell team ;)

Typically I would create a function named Citrix-NewObject or, better, New-CitrixObject.

What I can do instead is prefix the verb-noun pair:

Citrix:New-Object

It is much easier to use and read, and it also allows simple dumps (Dir Function:Citrix:* to get all Citrix-related functions).
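A minimal sketch of the pattern, following the post’s own example (the function body is just a placeholder):

Function Citrix:New-Object {
    param ($TypeName)
    "Creating Citrix object of type $TypeName"
}

Citrix:New-Object Farm    # reads as “New-Object from the Citrix toolbox”
Dir Function:Citrix:*     # dump all Citrix-related functions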

Cloud 2 – synchronize files and folders

Here comes the second part of the posts about my cloud – synchronizing files and folders between multiple computers.

Currently I am using Live Mesh – cloud computing from Microsoft itself. You get 5GB for free and an online desktop (AJAX based, of course). There is also a beta client for Windows Mobile available. Live Mesh is simply automated synchronization between multiple computers – and one of the “computers” is your online desktop. A new feature is that you can join (if you are invited) someone else’s folder – so Live Mesh is not only synchronization between your own computers, but also synchronization between different people.

I still don’t understand Microsoft’s strategy – and I think it is some kind of power struggle that there are several different Live technologies that are very, very similar: Live Mesh, Live Sync\FolderShare, SkyDrive… Each of them has a slightly different set of features – SkyDrive gives you the most space (25GB vs 5GB), Live Mesh has the online desktop, Live Sync has the ability to browse remote files… Joining them would make the ultimate cloud experience :(

Even though there are some nice ideas, it still contains too many issues – or let’s say missing features.

What do I want to synchronize between my computers (and have available online)?

Of course documents – that is normal Live Mesh usage, together with photos. However, I also want to synchronize more dynamic stuff like Favorites – and Mesh doesn’t really work with this type of data.

Where is the problem? Internet Explorer periodically overwrites existing .lnk files for some reason, and Mesh doesn’t have any advanced conflict handling. You will see the conflicts once you open the folder:

[screenshot: conflicted files shown in the synchronized folder]

There is, however, no central overview of all conflicts. So if you think you have a conflict, you must go through all your folders :(

So once you start synchronizing the Favorites folder, synchronization will stop after a few minutes. There is no way to set up conflict handling in Live Mesh – for Favorites, I would like to specify that the latest file should always win.

I am also using Live Mesh to synchronize my tools and utilities – and they are automatically updated (wait for Cloud 3 ;)).

Anyway, too many features are missing, even though I can almost feel the potential of Live Mesh. Here is my wish list:

  • Automated conflict resolution or central conflicts dashboard
  • Browse remote folders without needing to subscribe
  • Ability to connect to remote computer without requiring confirmation
  • Ability to browse remote computer (remember FolderShare?)

Tuesday, March 3, 2009

What could 2019 look like? ;)

Sometimes I feel really sorry about how different Microsoft’s visions and their realizations are – and it happened to me again today :(

I just watched this really great video: http://www.istartedsomething.com/20090228/microsoft-office-labs-vision-2019-video/

Highly recommended – at the very least, we must admit that the UI is really nice and clean ;)

UPDATE: Another interesting vision, this time from Sun Microsystems (1993) ;)