
Tuesday, January 31, 2017

Security and the PnP.Core template

If you are using a PnP.Core template to provision the structure for a site, you will also want to provision the security with the template.  You can build out new security groups by adding the following node as a direct child of the ProvisioningTemplate:


      <pnp:Security>
        <pnp:SiteGroups>
          <pnp:SiteGroup Title="SharePointGroupName" 
              Description="Sample SharePoint Group" 
              Owner="i:0#.w|domain\user" 
              AllowMembersEditMembership="true" 
              AllowRequestToJoinLeave="false" 
              AutoAcceptRequestToJoinLeave="false" 
              OnlyAllowMembersViewMembership="false">
            <pnp:Members>
              <pnp:User Name="i:0#.w|domain\user" />
            </pnp:Members>
          </pnp:SiteGroup>
        </pnp:SiteGroups>
      </pnp:Security>


Then, on your lists you just add the following as a direct child of the ListInstance node:


          <pnp:Security>
            <pnp:BreakRoleInheritance 
                  CopyRoleAssignments="false" 
                  ClearSubscopes="false">
              <pnp:RoleAssignment 
                    Principal="SharePointGroupName" 
                    RoleDefinition="Full Control" />
            </pnp:BreakRoleInheritance>
          </pnp:Security>

It really is as easy as that.
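
If you want to see the whole thing end to end, here is a minimal sketch of applying a template that contains these Security nodes, using the OfficeDevPnP.Core provider and extension methods; the site URL and template file name are placeholders, and authentication is omitted:

using Microsoft.SharePoint.Client;
using OfficeDevPnP.Core.Framework.Provisioning.Providers.Xml;

namespace BlogPost
{
    class ApplySecurityTemplate
    {
        static void Main(string[] args)
        {
            // Placeholder URL; add credentials appropriate to your environment.
            using (var context = new ClientContext("https://tenant.sharepoint.com/sites/demo"))
            {
                // Load the template XML from the current directory.
                var provider = new XMLFileSystemTemplateProvider(@".\", string.Empty);
                var template = provider.GetTemplate("security-template.xml");

                // ApplyProvisioningTemplate is the PnP extension method on Web that
                // creates the groups and role assignments defined in the template.
                context.Web.ApplyProvisioningTemplate(template);
            }
        }
    }
}

Keep in mind that the Owner and member login names (the i:0#.w|domain\user claims) have to be valid in the target environment, so adjust them if you deploy to more than one farm.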

The Version of Visual Studio you are running matters

Let's pretend for a second that you and a colleague are about to start a project together.  Let's also pretend that you are going to be using C# to develop the project.  You get together, poll your other colleagues, and determine that .NET Framework 4.5.2 is the only way to go.  You are done now, right?  Anything else to discuss?  Ok, you will use Git for source control.  Enough?  Tabs and not spaces, no regions.  Surely you are done now, right?  What about the Visual Studio version?  Not Edition, Version.  You should agree on that too, otherwise your colleague might not be able to compile the code you write.
Wait, doesn't the .NET Framework version dictate the language features you have available?  No, it doesn't; the C# compiler that ships with your version of Visual Studio does.  Do you doubt me?  Open up Visual Studio 2013 and try adding the following code to some class:


        public static string ExampleProperty { get; set; } = "Yowza!";

What do you get?  I get:
    Invalid token '=' in class, struct, or interface member declaration

Syntax error on Property initialization in VS 2013
Ok, maybe I am just writing bad code?  Try the same thing in VS 2015.  Works, doesn't it?  Are you getting the picture?  Do I have your attention yet?
Well, what happens when we decompile the assembly created by VS 2015?  We get this:


using System;
using System.Runtime.CompilerServices;
 
namespace BlogPost
{
    internal class Program
    {
        public static string ExampleProperty
        {
            get;
            set;
        }
 
        static Program()
        {
            Program.ExampleProperty = "Yowza!";
        }
 
        public Program()
        {
        }
 
        private static void Main(string[] args)
        {
            Console.WriteLine(Program.ExampleProperty);
        }
    }
}

Look familiar?  The compiler moved the property initializer into a static constructor for us.  So what would happen if we already had a static Program() constructor of our own?  Let's try and find out!

If we start with this:


using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
 
namespace BlogPost
{
    class Program
    {
        public static string ExampleProperty { get; set; } = "Yowza!";
 
        public static string ExampleVariable;
        static Program()
        {
            ExampleVariable = "wow";
        }
 
        static void Main(string[] args)
        {
            Console.WriteLine(ExampleProperty);
            Console.WriteLine(ExampleVariable);
        }
    }
}

We end up with this when we decompile:


using System;
using System.Runtime.CompilerServices;
 
namespace BlogPost
{
    internal class Program
    {
        public static string ExampleVariable;
 
        public static string ExampleProperty
        {
            get;
            set;
        }
 
        static Program()
        {
            Program.ExampleProperty = "Yowza!";
            Program.ExampleVariable = "wow";
        }
 
        public Program()
        {
        }
 
        private static void Main(string[] args)
        {
            Console.WriteLine(Program.ExampleProperty);
            Console.WriteLine(Program.ExampleVariable);
        }
    }
}

Interesting, isn't it?  That is part of why I am looking into the internals of the binary files produced by different frameworks and different compilers.  Don't get upset when your Dev Lead or Architect won't let you move to the latest and greatest compiler.  There is a reason why standards are important in all things.
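
If a mixed-compiler team is unavoidable, one option (a sketch only, assuming an MSBuild-based .csproj) is to pin the language version so that even the newer compiler rejects the newer syntax.  In Visual Studio this lives under Build > Advanced > Language Version, which maps to a project property like this:

  <PropertyGroup>
    <!-- Pin the compiler to C# 5 so C# 6-only syntax fails on every machine, not just in VS 2013 -->
    <LangVersion>5</LangVersion>
  </PropertyGroup>

With that in place, the auto-property initializer above fails to compile in VS 2015 as well, which at least makes the disagreement visible early.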

Keep having fun, and asking questions!

Sunday, January 29, 2017

Using Command Line Verbs with the Beta Version of the Command Line Parser Library

A very nice feature of the Command Line Parser Library is the verb option.  It keeps you from having to throw in a bunch of Boolean flags and flow-control code.  Verbs allow you (really, force you) to keep the arguments for each action in a separate Options object.  To use verbs, you will need to install the beta version from NuGet (or pull the source if you are really adventurous):
Command Line Parser Library Beta NuGet package
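From the Package Manager Console, pulling in a prerelease build looks something like this (assuming the package ID is still CommandLineParser):

Install-Package CommandLineParser -Pre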
Once you do, a whole new world opens up for you.  You can now create verb option objects that isolate specific arguments to those actions.  To create a verb option object, you simply decorate a class with the Verb attribute, passing in the command line keyword and the HelpText.  It really is as simple as that.  Never one to leave well enough alone, I also played around with using a GlobalOptions object that each of the specific verb objects inherits from.  That works as expected, and allows you to consolidate common arguments in a single location.

Without further ado, here are my three test options objects:


    class GlobalOptions
    {
        [Option('v', "verbose", Default = false,
          HelpText = "Prints all messages to standard output.")]
        public bool Verbose { get; set; }
    }


    [Verb("add", HelpText = "Add files to the site.")]
    class AddOptions : GlobalOptions
    {
        [Option('s', "source", Default = ".",
          HelpText = "The source directory for the files to process.")]
        public string Source { get; set; }
    }


    [Verb("show", HelpText = "Display the contents of a file.")]
    class ShowOptions : GlobalOptions
    {
        [Option('f', "file", Default = ".",
          HelpText = "The file to show.")]
        public string File { get; set; }
    }

And here is the main program:


        static int Main(string[] args)
        {
            return Parser.Default.ParseArguments<AddOptions, ShowOptions>(args)
              .MapResult(
                (AddOptions opts) => RunAddAndReturnExitCode(opts),
                (ShowOptions opts) => RunShowAndReturnExitCode(opts),
                errs => 1);
        }
 
        private static int RunShowAndReturnExitCode(ShowOptions options)
        {
            if (options.Verbose && !string.IsNullOrEmpty(options.File))
            {
                Console.WriteLine("Source File: {0}", options.File);
            }
            Console.WriteLine("displaying file");
            return 0;
        }
 
        private static int RunAddAndReturnExitCode(AddOptions options)
        {
            if (options.Verbose && !string.IsNullOrEmpty(options.Source))
            {
                Console.WriteLine("Source of Files: {0}", options.Source);
            }
            Console.WriteLine("adding files");
            return 0;
        }

And finally, here is the sample output:
C:\blog2>BlogPost.exe
BlogPost 1.0.0.0
Copyright © Microsoft 2017

ERROR(S):
  No verb selected.

  add        Add files to the site.

  show       Display the contents of a file.

  help       Display more information on a specific command.

  version    Display version information.


C:\blog2>BlogPost.exe help add
BlogPost 1.0.0.0
Copyright © Microsoft 2017

  -s, --source     (Default: .) The source directory for the files to process.

  -v, --verbose    (Default: false) Prints all messages to standard output.

  --help           Display this help screen.

  --version        Display version information.


C:\blog2>BlogPost.exe add
adding files

C:\blog2>BlogPost.exe add -v
Source of Files: .
adding files

A truly valuable addition to an already great library.  Next, I will get back to the PnP.Core provisioning series.  See you tomorrow!  Don't forget to unit test!

Follow-up on the Command Line Parser Library

In the last post I mentioned building command line tools with arguments and then using PowerShell to provide a repeatable experience.  I think it is worth spending some more time talking through additional options the Command Line Parser Library offers, and how to write a PowerShell script that reads arguments from an XML file.

First, let's look at some simple PowerShell code to call our command line tool.

[xml]$configurationDocument = Get-Content .\install.xml
If($configurationDocument.configuration.userName -ne '') 
    {$userName = $configurationDocument.configuration.userName}
If($configurationDocument.configuration.password -ne '') 
    {$password = $configurationDocument.configuration.password}

.\BlogPost -u $userName -p $password


The important part is the first line, which reads the content of an XML file and makes it available within your script as an XML object.  With the following XML:

<configuration>
    <userName>larrys@corp</userName>
    <password>mYP4$$w0rd!</password>
<!--
    <userName>otherName@domain</userName>
    <password>My0therP4ssw0rd!</password>
-->
</configuration>

I get the following results:

PS C:\blog> .\BlogPost.ps1
User Name: larrys@corp
Password (obscured): .........
No Default: False
Required: something hard coded
CommandLine.ParserSettings
Doing my work now.
PS C:\blog>

Next, the arguments I showed in the last post were strings and bools.  The bool had a default value, the strings did not, and none of them were required.  That is the easiest kind of option to use, and it is the default behavior.  The library is very flexible though, and allows us to customize the behavior through settings.  By switching from the Default parser to our own instance, we can pass a ParserSettings object into the constructor to alter the default behavior:


            var mySettings = new CommandLine.ParserSettings();
            mySettings.CaseSensitive = true;
            mySettings.HelpWriter = Console.Out;
            mySettings.IgnoreUnknownArguments = false;
            mySettings.MutuallyExclusive = true;
            mySettings.ParsingCulture = System.Globalization.CultureInfo.CurrentUICulture;
            var parser = new CommandLine.Parser(mySettings);
            var options = new Options();
            if (parser.ParseArgumentsStrict(args, options))
            {

If we examine the ParserSettings object, we can see that we can have case-sensitive arguments (the default is false), we can redirect the output to a different stream, we can have it ignore unknown arguments, we can create sets of mutually exclusive arguments, and we can set the parsing culture.  With the settings above, I added the following options:



        [Option('r', "required", Required = true,
            HelpText = "This is a required parameter.")]
        public string Required { get; set; }
 
        [Option('t', "no default", 
            HelpText = "This is bool parameter with no default.")]
        public bool NoDefault { get; set; }
 
        [Option('a', Required = false, MutuallyExclusiveSet = "abc", 
            HelpText = "Choose A, B, or C")]
        public string ChoiceA { get; set; }
 
        [Option('b', Required = false, MutuallyExclusiveSet = "abc",
            HelpText = "Choose A, B, or C")]
        public string ChoiceB { get; set; }
 
        [Option('c', Required = false, MutuallyExclusiveSet = "abc",
            HelpText = "Choose A, B, or C")]
        public string ChoiceC { get; set; }

Which produce the following results:
C:\blog>BlogPost.exe -r bob
No Default: False
Required: bob
CommandLine.ParserSettings

Doing my work now.

C:\blog>BlogPost.exe -r bob -a something
No Default: False
Required: bob
Choice A: something
CommandLine.ParserSettings
Doing my work now.

C:\blog>BlogPost.exe -r bob -a something -c another
BlogPost 1.0.0.0
Copyright © Microsoft 2017

ERROR(S):
  -a option violates mutual exclusiveness.


  -v, --verbose       (Default: True) Prints all messages to standard output.

  -u, --userName      Username for a user with rights to create lists on the
                      destination site.

  -p, --password      Password for a user with rights to create lists on the
                      destination site.

  -r, --required      Required. This is a required parameter.

  -t, --no default    This is bool parameter with no default.

  -a                  Choose A, B, or C

  -b                  Choose A, B, or C

  -c                  Choose A, B, or C

  --help              Display this help screen.
In the next post on the Command Line Parser Library I will talk a bit about verbs.

Friday, January 27, 2017

Quick post about Command Line Arguments in C# Console Applications

As you may be able to tell from the context switch to the PnP.Core series, I am building an install utility for SharePoint for my current client.  One of the things that I always try to do is provide artifacts to clients that are easy to use and flexible.  The way I accomplish that with command line tools is to use the Command Line Parser Library to handle any inputs needed to control what the application does.  I also use an XML configuration file, which I read with a PowerShell script that builds the argument list for the exe.  My goal is to make the whole thing a repeatable process.  What I realized today, when the requirements of the tool changed (shocker, I know) while doing a mock deployment with their engineering team, is that it also makes the tool extremely easy to maintain and modify.

To get into some non-specific specifics, I had been told that I needed to use one type of authentication to connect to their SharePoint servers, and it worked great in the QA environment but was failing in UAT.  I use a completely different authentication method within our Dev environment, so I wasn't really shocked, but we had 8 people on the call so turning the change around quickly was important.  All that I had to do was make the required authentication change, add another command line argument, add a line to the configuration file, and update the PowerShell script to pass it in.  It took me less than 10 minutes to update both deployment programs (obey Curly's Law!) and provide bundles to the installation team.  Everyone was a little shocked because they expected us to have to pick the test up again on Monday to give me time to make the necessary changes.

As an aside, this isn't uncommon in large enterprises.  They often suffer from a bad case of Snowflake Servers and that leads to friction during the deployment process because each environment you move into is a new challenge.  Separation of Concerns also keeps you from being able to do your own deployment into at least UAT and Prod, but if you plan ahead and know that there are going to be environmental differences you can engineer the required flexibility into your tools.

Anyway, back to the point.  It is easy to add your command line argument parsing using the library.  You build an Options object with decorated properties and pass an instance of it and the args array to a static method and you are done.  Here is a simplified framework for your Main method:

    static void Main(string[] args)
    {
        var options = new Options();
        if (CommandLine.Parser.Default.ParseArguments(args, options))
        {
            // do your stuff here to validate the parameters
            // and to setup how the program runs
        }
        if (options.ShowingHelp)
        {
            return;
        }
        // do the actual work

    }
Here is a sample Options object:

    class Options
    {
        [Option('v', "verbose", DefaultValue = true,
          HelpText = "Prints all messages to standard output.")]
        public bool Verbose { get; set; }

        [Option('u', "userName", Required = false,
          HelpText = "Username for a user with rights to create "+
            "lists on the destination site.")]
        public string UserName { get; set; }

        [Option('p', "password", Required = false,
          HelpText = "Password for a user with rights to create "
            +"lists on the destination site.")]
        public string Password { get; set; }

        [ParserState]
        public IParserState LastParserState { get; set; }

        [HelpOption]
        public string GetUsage()
        {
            ShowingHelp = true;
            return HelpText.AutoBuild(this,
              (HelpText current) =>
              HelpText.DefaultParsingErrorsHandler(this, current));
        }

        public bool ShowingHelp { get; set; }

    }

Getting started is as easy as:
  1.  Create your console application
  2.  Install the NuGet package
Command Line Parser NuGet Package

  3.  Add an Options object with decorated properties
  4.  Call the parser in your Main method
  5.  Use the results

Within the section for reacting to the options, I generally add lines like this to give proactive feedback to the person using the program:

    if (options.Verbose && !string.IsNullOrEmpty(options.UserName))
    {
        Console.WriteLine("User Name: {0}", options.UserName);
    }
    if (options.Verbose && !string.IsNullOrEmpty(options.Password))
    {
        Console.WriteLine("Password (obscured): {0}", ".........");

    }

I hope this helps!  Keep your code clean, see you tomorrow.

PnP Core Provisioning: Part 4

The PnP.Core template is more versatile than the built-in SharePoint template because you can composite multiple templates onto the same site.  By mixing and matching lists and features, you can create a custom solution from component parts.  You can also create a data-only template that will populate data for your QA team, and apply it only into QA.  The ability to generate a template containing data is not built into the PnP.Core library because the team responsible for it doesn't want it thought of as a backup tool.  That doesn't mean we can't build something ourselves to do it.
For now, let's just look at how we would manually create a template that only adds data.  There are actually two ways: one through XML and the other through code.  First, let's look at the XML option.  Here is the template for a simple list for holding countries.  Now, before anyone starts ranting about it, yes, I know that it would be better in most cases to store such flat/static data in a Term Store.  No, I am not going to show how to provision a term store through the template yet.  Here is the XML:


    <pnp:ListInstance
           Title="CountryList"
           Description=""
           DocumentTemplate=""
           TemplateType="100"
           Url="Lists/CountryList"
           MinorVersionLimit="0"
           MaxVersionLimit="0"
           DraftVersionVisibility="0"
           TemplateFeatureID="00bfea71-de22-43b2-a848-c05709900100"
           EnableFolderCreation="false">
       <pnp:ContentTypeBindings>
         <pnp:ContentTypeBinding ContentTypeID="0x01" Default="true" />
         <pnp:ContentTypeBinding ContentTypeID="0x0120" />
       </pnp:ContentTypeBindings>
       <pnp:DataRows>
         <pnp:DataRow>
           <pnp:DataValue FieldName="Title">Canada</pnp:DataValue>
         </pnp:DataRow>
         <pnp:DataRow>
           <pnp:DataValue FieldName="Title">UK</pnp:DataValue>
         </pnp:DataRow>
         <pnp:DataRow>
           <pnp:DataValue FieldName="Title">US</pnp:DataValue>
         </pnp:DataRow>
       </pnp:DataRows>
     </pnp:ListInstance>

This will create a standard custom list and populate three rows.  They are simply added within the pnp:DataRows section.

The other way to create a data-filled provisioning template is through code: create an instance of the ProvisioningTemplate object and add a ListInstance filled with DataRows.  Once you have populated the collection, you can export it to XML.  Here is some sample code that creates a list named Test List and populates one row of data:


using OfficeDevPnP.Core.Framework.Provisioning.Model;
using System.Collections.Generic;

namespace GenerateAPopulateDataTemplate
{
    class Program
    {
        static void Main(string[] args)
        {
            var fileName = "testTemplate.xml";
            ProvisioningTemplate provisioningTemplate
                = new ProvisioningTemplate();
            ListInstance list = new ListInstance();
            list.Title = "Test List";

            var data = new Dictionary<string, string>();
            data.Add("Title", "First Item");
            list.DataRows.Add(new DataRow(data));
            provisioningTemplate.Lists.Add(list);
            System.IO.File.WriteAllText(
                string.Format(@".\{0}", fileName),
                provisioningTemplate.ToXML());
        }
    }
}

Here is the XML template created by running the code above:



<pnp:Provisioning xmlns:pnp="http://schemas.dev.office.com/PnP/2016/05/ProvisioningSchema">
  <pnp:Preferences Generator="OfficeDevPnP.Core, Version=2.8.1610.1, Culture=neutral, PublicKeyToken=null" />
  <pnp:Templates ID="CONTAINER-">
    <pnp:ProvisioningTemplate Version="0">
      <pnp:Lists>
        <pnp:ListInstance Title="Test List" TemplateType="0" MinorVersionLimit="0" MaxVersionLimit="0" DraftVersionVisibility="0">
          <pnp:DataRows>
            <pnp:DataRow>
              <pnp:DataValue FieldName="Title">First Item</pnp:DataValue>
            </pnp:DataRow>
          </pnp:DataRows>
        </pnp:ListInstance>
      </pnp:Lists>
    </pnp:ProvisioningTemplate>
  </pnp:Templates>
</pnp:Provisioning>


Happy coding!

Thursday, January 26, 2017

PnP Core Provisioning: Part 3

Today was a tough day with provisioning.  I ran into a problem at work and spent all day trying to track it down.  The provisioning engine was failing on a client system, but working on multiple generic farms, both 2013 and 2016.  I finally had to get direct access to do the deployment from a development machine and debug the PnP.Core itself to track down what was causing the problem.  Even then, I was only able to pinpoint that the issue was related to a Document Library that was in the template.  Having found that out, we replaced the Document Library with a List of the same name so that the lookup columns would all still function.  Given that, I thought it appropriate to spend a post showing how to get the source for PnP.Core from GitHub and use it directly.

I am going to be using Visual Studio Enterprise 2015 for this, but it should work fine with Visual Studio 2015 Community.  Open up your Team Explorer and then Manage Connections.  Once there, Clone the repository:
Clone the PnP.Core repository
Once you have your clone of the project, just open the PnP.Core solution file:
PnP.Core solution file
Once you do, you should find 2 projects and a readme.md file:
PnP.Core solution file open
As a quick aside, if you open the readme.md file and see the raw markup:
readme.md file raw markup
then I would strongly suggest you add the Markdown Editor by Mads Kristensen:
Markdown Editor
so that you get the cool split screen:
readme.md viewed with the Markdown Editor
You will want to see it formatted and you will definitely want to read it.  The most important section is the one about choosing the proper build configuration for your target environment:

Compiling for SharePoint 2013
SharePoint 2013 depends on version 15 client assemblies, SharePoint 2016 depends on 16 client assemblies whereas Office 365 (SharePoint Online) uses version 16.1 client assemblies. The PnP core solution foresees support for this. The solution contains 6 configurations:

  • Debug: compiles the solution in debug mode using the version 16.1 assemblies (=default)
  • Release: compiles the solution in release mode using the version 16.1 assemblies
  • Debug15: compiles the solution in debug mode using the version 15 assemblies
  • Release15: compiles the solution in release mode using the version 15 assemblies
  • Debug16: compiles the solution in debug mode using the version 16 assemblies
  • Release16: compiles the solution in release mode using the version 16 assemblies
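
If you build from the command line rather than the IDE, you pick one of these configurations with the standard MSBuild switch; the solution file name below is an assumption, so check what the repository actually contains:

msbuild OfficeDevPnP.Core.sln /p:Configuration=Debug15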

Once you understand that, you can easily add a console application to the solution and pick up where we left off in Part 2.  You may well have to do some tweaking to get it working, but it shouldn't be too difficult.  For example, when I tried to compile this time, the first build produced 5 errors because of a bad reference to Microsoft.Identity:
Initial Compilation Errors
I just needed to add a NuGet package for it to make the errors vanish:
Adding the Microsoft.Identity NuGet package
and then it built cleanly.  That's it.  Just add your own project to the solution to wrap the DLL, and you are able to debug into the library and make modifications to it yourself.

The Microsoft team has done a great job getting the required packages into NuGet, and an awesome job with the PnP.Core library itself.  If you happen to find an error or a shortcoming in the library, consider making the fix yourself and providing a pull request for the team.  They are very open to contributions and there is an active community constantly improving the product:
Commit Graph
Most of the contributions are coming from outside of Microsoft.  Pitch in and help by looking through the Issues list:
Issues List, Open Issues

or adding your own.  There are currently 156 open issues.  Help get it down to below 100!

Keep your code clean!

Wednesday, January 25, 2017

PnP Core Provisioning: Part 2

The Microsoft Patterns and Practices group has developed a deployment tool for SharePoint called PnP Core that allows you to provision structure, data, settings, security, and artifacts to a SharePoint site from an XML input file.  I took the captured template and removed some of the nodes.
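In case you are curious what that capture looks like in code, here is a minimal sketch using the OfficeDevPnP.Core extension methods; the site URL and output file name are placeholders, and authentication is omitted:

using Microsoft.SharePoint.Client;
using OfficeDevPnP.Core.Framework.Provisioning.Providers.Xml;

namespace BlogPost
{
    class CaptureTemplate
    {
        static void Main(string[] args)
        {
            // Placeholder URL; add credentials appropriate to your environment.
            using (var context = new ClientContext("https://tenant.sharepoint.com/sites/source"))
            {
                // Extract the site's structure into a ProvisioningTemplate object.
                var template = context.Web.GetProvisioningTemplate();

                // Serialize the template to an XML file in the current directory.
                var provider = new XMLFileSystemTemplateProvider(@".\", string.Empty);
                provider.SaveAs(template, "captured-template.xml");
            }
        }
    }
}

Below are two fields pulled from the captured XML that show a Single line of text and a Multiple lines of text field definition: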

            <Field Type="Text" 
                   DisplayName="SingleLineOfText" 
                   Required="FALSE" 
                   EnforceUniqueValues="FALSE" 
                   Indexed="FALSE" 
                   MaxLength="255" 
                   ID="{ce3ab098-96f6-444f-83e0-9f3e875a48c5}" 
                   SourceID="{{listid:Part2List}}" 
                   StaticName="SingleLineOfText" 
                   Name="SingleLineOfText" 
                   ColName="nvarchar3" 
                   RowOrdinal="0" />
            <Field Type="Note" 
                   DisplayName="MultipleLinesOfText" 
                   Required="FALSE" 
                   EnforceUniqueValues="FALSE" 
                   Indexed="FALSE" 
                   NumLines="6" 
                   RichText="TRUE" 
                   RichTextMode="FullHtml" 
                   IsolateStyles="TRUE" 
                   Sortable="FALSE" 
                   ID="{564f1328-8812-4b3a-8413-af7983304fd0}" 
                   SourceID="{{listid:Part2List}}" 
                   StaticName="MultipleLinesOfText" 
                   Name="MultipleLinesOfText" 
                   ColName="ntext2" 
                   RowOrdinal="0" />

If you look at the definitions above, you can see that the Type of SingleLineOfText is Text and the Type of MultipleLinesOfText is Note.  The other attributes are pretty simple to understand.  The DisplayName is what shows in views, and the Name is the internal name.  The ID is the GUID that uniquely identifies the column, and it is important that these GUIDs are unique.  If you copy a column definition and forget to update the GUID, the duplicates will cause very strange behavior; you could destabilize your entire farm.  The SourceID points to the list that contains the field.
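
If you do copy an existing field definition as a starting point, a quick way to mint a fresh ID is a one-liner like this (a trivial sketch):

using System;

class NewFieldId
{
    static void Main()
    {
        // The "B" format produces the braced form used by field IDs, e.g. {ce3ab098-96f6-444f-83e0-9f3e875a48c5}
        Console.WriteLine(Guid.NewGuid().ToString("B"));
    }
}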