Lotus Notes to SharePoint Blog

Blog about Dell's Notes Migrator to SharePoint tool and other things related to Lotus Notes migration projects

Category Archives: Analysis

Do you require more in-depth design analysis?

After you have completed your Notes application analysis with Notes Migrator for SharePoint, you will probably discover that some of your complex Notes applications will require custom development on SharePoint.  To help you clarify the level of development required and assist the developers performing the actual conversion, we recommend Teamstudio Analyzer for the in-depth design analysis.  Click here for more information on how Quest partners and customers can benefit from using Teamstudio Analyzer.

Notes Migrator for SharePoint 6.1 Reviewers Guide

Notes Migrator for SharePoint 6.1 is a significant product release that pushes the product further in three important areas: Design Migration, Content Migration and Pre-Migration Analysis.  This release will be “Generally Available” on the Quest web site in a few days.  If you were in the beta program you already have the final build (6.1.0.626).  Below is a high-level list of this release’s most important new features.

Design Migration

Migrate Notes Views to SharePoint Views – This feature will allow you to select a Notes view and generate a similar view in any SharePoint list or library.  Because the mapping of columns in your view is intimately tied to the mapping and provisioning of data columns that occurs in your migration job, this new capability is also tied to your migration job.  Specifically, view migration is found on the new Advanced tab of your Target Data Definition.  You can design new views there and the views are provisioned when the migration job is run.  A nice side effect of this model is that you can run the same migration job against many new or existing lists or libraries.

[screenshots]

Press “Import from Notes” and then select the Notes view you want to migrate.  The view migration wizard will then attempt to design an equivalent SharePoint view, mapping as many Notes columns as possible to existing SharePoint columns OR to new columns defined in your job’s target data definition.  As you might expect, there are a number of things that can go wrong with such an automated mapping.  The two most likely issues are (1) the Notes data item shown in the view has not been migrated to the SharePoint list yet or (2) the Notes view column is computed and the formula is too complicated for a tool to convert.  For issues like these, manual intervention is required and the migration wizard guides you through it.  In the View Columns step, it gives you a side-by-side view of the Notes and SharePoint columns and highlights the parts that it needs help with in red.  You can decide to manually map an existing Notes column, define a new data column (which will be added to your migration job), or even specify a formula for a new computed column in SharePoint.  Similarly, you can also specify the sort order, the grouping and the document selection rules, or you can just take the defaults.

Content Type Generation – This feature takes the tool’s ability to migrate schema (select fields from a custom Notes application and provision a similar schema in a custom list) to a new level.  Now you can migrate your custom Notes application schema to SharePoint content types instead.  You can create new Content Type Definitions on the new Content Types tab of your Database or Technical Class records in the Migration Console.  You can design them from scratch or, more likely, you will generate the Content Type definitions from a Notes Form, from an existing migration job, or from an existing list that you have been working on.  (The last two options allow you to do your initial development and testing on a custom list and then transition to content types later.)

[screenshots]

Once you have generated an initial Content Type Definition, you can further customize it, specifying the parent content type, the group, and the columns.  The columns editor looks like the Target Data Definition columns editor, but in this case you are defining Site Columns instead of List columns.  When you are ready to provision your new Content Type, the tool will try to find existing Site Columns that match your specification and will provision new ones if needed. 

[screenshots]

New Column Provisioning Options – The ability to provision lists and libraries while migrating content has long been a popular feature.  This capability has been greatly expanded to allow provisioning a nearly complete list of SharePoint column options, such as default values and data validation.  You can even set columns to be read-only (something you can’t normally do in the SharePoint user interface).  Note that as before, the provisioning options only apply when the tool is first provisioning the columns; it does not upgrade existing columns.

[screenshot]

Provision Calculated Columns – You can now define Calculated columns in your target data definition.  These will be provisioned in SharePoint when your migration jobs are run.  This Microsoft documentation page explains the legal syntax for SharePoint formulas:  http://msdn.microsoft.com/en-us/library/bb862071.aspx.  You can also use our new built-in formula editor, which appears at several points in the product.
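For example, here are two calculated-column formulas that would be legal in SharePoint (the column names are hypothetical, purely for illustration):

        =CONCATENATE([First Name], " ", [Last Name])
        =IF([Status]="Closed", "Archive", "Active")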

[screenshots]

Provision Lookup Fields – Lookup fields can now be provisioned in SharePoint lists.  Previous versions of the tool allowed you to migrate to Lookup fields, but you first had to manually configure them in SharePoint. You can even configure lookups from lists in other sites in the same site collection (something you can’t normally do in the SharePoint user interface).

[screenshot]


Content Migration

New Run Job options – During normal migration jobs, Notes Migrator for SharePoint first takes care of the provisioning steps (creating the list/library if needed, adding the appropriate content types to the list, adding any missing columns, creating views, and setting the list/library permissions) and then migrates the content.  Now you can choose to run just the provisioning steps without migrating the content.  This will be useful in cases where you want to review the resulting list schema and perhaps make changes to the list settings before migrating content.

[screenshots]

Also note that the above screen shots show off the tool’s new Run Job button with the drop-down menu of various modes of running the tool, such as exporting data to and importing data from intermediate XML files.  Since this button is also available in the Migration Console, users can now leverage the intermediate file options from there (they were formerly available in the Designer Client only).  Similar functionality is also available in the tool menus and the bulk Migrate To SharePoint task in the Migration Console.

Migrate by Form Used – When selecting records from your Notes database, you can now elect to query records by form.  You can pick one or more Forms from the Notes database design or type them manually.  This will limit the records selected to include only those data records that were flagged with the forms you selected (i.e., where the “Form” item was set to one of those form names).
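In Notes formula terms, selecting by form is roughly equivalent to a selection formula like the following (the form names here are hypothetical):

        SELECT Form = "Invoice" | Form = "PurchaseOrder"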

[screenshots]

If you specify forms in this way, the Select Items and Columns dialog will also display the data items defined on your selected forms.  (Remember, however, that background agents, etc., may have also set data items on the Notes documents, so you may still need to look at the Sampled Items node to discover those.)

Improved Content Type Support in Migration Jobs – Notes Migrator for SharePoint already had pretty good support for dealing with content types while migrating content, but this release extends those capabilities and also makes them much easier to use.  The new Manage Content Types section on the first tab of the Target Data Definition allows you to push site content types into a new or existing SharePoint list.  Adding the content type to the list is a prerequisite to writing documents with those content types, so this new feature eliminates that manual step and makes more automation possible.

[screenshot]

In addition, you can now associate each content type with one or more Notes forms.  If you do this, then the tool will automatically assign the content type for each migrated document without the need to do any explicit mappings.  (For more complex scenarios you can still explicitly map any piece of Notes data, including the result of a formula evaluation, to a ContentType field.)

Note that this feature is completely independent of the ability to generate new Site Content Types, as described above under Design Migration, but you may often find yourself using the two in combination.

Normalize multi-valued items to multiple documents – It is common for Notes forms to implement detail records (the things relational developers would have set up a one-to-many relationship for) using multi-valued data fields arranged to look like a table, as shown below.  Users would then fill in as many entries as they needed, making sure that the values from each multi-valued item lined up properly.

[screenshots]

Now you can use Notes Migrator for SharePoint to extract the multi-valued items as separate records.  To do this, set the Multi-Value disposition option for the columns that you expect to contain arrays (ContactName, ContactTitle, and ContactPhone in the above example) to “Normalize”.  This will cause NMSP to generate multiple SharePoint items (three in the above example) from the values in the columns designated for Normalization.
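For illustration, suppose a single Notes document contained these multi-valued items (hypothetical values):

        ContactName:   "Maria Anders"; "Ana Trujillo"; "Antonio Moreno"
        ContactTitle:  "Owner"; "Sales Agent"; "Buyer"
        ContactPhone:  "030-0074321"; "(5) 555-4729"; "(5) 555-3932"

Normalization would produce three SharePoint items, each taking the first, second, or third value from every normalized column, while single-valued columns in the job (such as the Customer ID mentioned below) repeat on each generated item.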

[screenshots]

Note that if you also wanted “header” records you could first migrate those using a separate migration job. You could use lookup fields or some other mechanism to relate the “header” and “detail” columns. In the above example, Customer ID in the Contacts list is set up as a lookup field that references the Customers list (where Customer Name and Category are also stored).

Finally, you may encounter Notes forms that list out each item in separate rows, instead of using multi-valued items. NMSP can handle that case as well, but the migration team will need to do a little work with formulas to generate multi-valued columns to Normalize on.  For example, you could define a source data definition column with the formula “Product_1: Product_2: Product_3: Product_4” to generate a “Products” array.

[screenshot]

Dynamic link tracking for URL columns – When migrating Notes data to SharePoint URL fields, you can now leverage the tool’s popular Link Tracking Service for those links (the same way you currently do for Doc Links in Rich Text Fields).  You do not need to do anything special to enable this.  Simply start migrating data items that store notes:// links (or formulas that generate notes:// links) and they will be converted to dynamic links that will ultimately point users to the migrated versions of those documents. 

As a reminder, the tool supports two input formats for URL fields.

  • <url> – sets the URL to be the display name in SharePoint
  • <url>, <name> – sets the URL and display name separately in SharePoint
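For example (hypothetical values), either of these source values would work:

        http://www.acme.com/policies
        http://www.acme.com/policies, Corporate Policies

The first displays the URL itself as the link text in SharePoint; the second displays “Corporate Policies”.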

Improvements in “Render With Form” function – This very popular feature, which allows you to extract the content of any custom Notes database (regardless of complexity) so you can archive it as simple rich text documents in SharePoint, has been improved in a number of respects.  Handling of dynamic subforms, computed fields, computed text, and keyword fields has been improved, resulting in even better-looking rendered documents.

Provision Alternate Libraries – Users have often relied on the tool’s unique ability to migrate images and attachments to a separate location from the main documents being migrated.  For example, you might want to place all attachments in the Shared Documents or Site Assets document library.  Previously you would have had to provision these alternate libraries manually before running your migration job.  Now the tool provisions these automatically if needed.

[screenshot]


Analysis / Migration Console

Capture Design Copies during Analysis – Now when you perform a Design Analysis on a set of databases you can elect to also create a local design copy.  For every database scanned, the tool will create a small copy of the database (design elements only, similar to a design template) on the local machine on which the design analysis is being performed.  The intent of this feature is to allow consultants and other migration personnel to view the full database designs even while they are disconnected from the production environment, which is particularly important on large analysis projects where in-depth manual design analysis is required.  The location of these databases is controlled by a setting on the Analysis tab of your Global Options dialog, and defaults to a folder in the tool’s <ProgramData> area.  Users can then easily open the Local Design Copy from any database view.

[screenshots]

Classify by Last Used (All Replicas) – This enhancement expands the available options for automatically determining which Technical Class or Business Class a database belongs to.  Now you can create a rule for recognizing class members based on when a database was last used across all known replicas.  For example, you can create a class that groups all databases with a Last Used date more than 365 days ago.

[screenshot]

Import data into repository from CSV files – This enhancement allows users to read in records from a CSV file and update values in certain Quest Repository database records.  The tool is available on the main Notes Migrator for SharePoint node.  First the user is prompted to select a CSV file and then a mapping dialog is displayed.  For each column in the CSV file, the user can choose to map it to one of the available database record properties.  Not all database properties can be imported from external data sources.  The intention here is that users who perform manual analysis/triage projects using spreadsheets or external tools can import that data back into the Migration Console.

This feature can also be used to add new database records into the repository (by Server and File Path, Replica ID, or Database Key).  The intention here is to allow users to import a list of databases to be analyzed or migrated.  Users would typically follow up such an import with a full analysis of those databases to populate the remaining database properties.

[screenshot]


General

Performance and scalability for Migration Console – The database views have been re-architected to support 60,000+ databases at a time.  With a very large repository, users may experience a delay while starting the tool as all the records are loaded into memory.  After that, scrolling, sorting and filtering database views and opening additional views should be very fast.  Special protections have also been added to prevent you from running very large reports that are likely to crash the migration console.

New User/Group Mapping options – When performing user/group name mapping, the Output Translation option now allows you to further transform the name that results from your name lookup before submitting that name for resolution in SharePoint.  This may be useful when your environment requires specific name formats that are not immediately available in your mapping source.

Also, the Test User/Group Mapping tool now has a Validate in SharePoint button which will try to resolve the name that your configured mapping process produces so you can verify that it really works in SharePoint. This should make experimenting with and debugging various user mapping options a little easier for everyone.  Note that this capability is only available when using client-side user mapping (not when configuring server-side mapping in the Import Service).

[screenshot]

Improved CBA/FBA support – Connections to SharePoint sites using Forms Based Authentication now automatically renew cookies as needed during long migration jobs and other operations.  The tool will routinely check the expiration time of the authentication tokens that it holds for the client and, if a token is due to expire within a certain window, it will force a new authentication.  (The default time limit is 30 minutes, but this can be changed in your SharePoint Environment settings.)  Depending on your particular authentication system, this may appear to the user as a browser prompt forcing the user to re-authenticate.  In other cases, it may appear as a browser window that opens briefly and then closes again.

[screenshot]

Windows Authentication using alternate account for Link Tracking Database – One new connection option is available for connecting to the Link Tracking Database.  You can now specify that you want to use Windows Authentication, but supply an account other than your own.

[screenshot]

Bulk Editing of certain database / class / job properties – Expanding on the tool’s capability to select multiple databases and set properties in bulk, you can now set a number of additional properties in a large number of selected databases.  You can even set certain properties inside the migration jobs assigned to the selected databases.  Finally, you now have a similar set of options for Technical Classes and Business Classes as well.

Usability improvements – A number of things have changed in the product to make the tool easier to use, including rearrangement of dialogs and additions of menus.  More pop-up help icons and more context-sensitive links to help topics have been added.  A completely rebuilt “Add/Remove Columns” dialog and the built-in documentation describing the 170 possible built-in columns make customizing views much easier.

[screenshots]

New Webcast: Migrating Lotus Notes Applications to SharePoint Online in Office 365

I recently co-presented a web presentation with Notes integration/migration rock star Gary Devendorf.  I was very happy with the results and the amount of “beef” we managed to squeeze into a marketing event.

See the recording here:  http://www.quest.com/events/ListDetails.aspx?ContentID=15382

[screenshot]

Getting Started with new Analysis Features in version 6.0

Last Design Modified / Modified By.  We have now added Last Modified Date and Last Modified By to the detailed design analysis information we collect.  This is visible in the Design Element Details dialog and is also included in our export to XML files.

[screenshot]

Similarly, the Modified By information is now available in the Design Element Differences dialog (shown when comparing with a template or reference database).

[screenshot]

Finally, the latest of the design modifications shown above is rolled up into the Design Modified Date and Design Modified By properties for the database records themselves and is available in views and reports.   Note that Design Modified Date was already available here and came from the database header scans.  Now we replace the header information with (possibly) more precise information when you do the Design Analysis.

[screenshot]

Timeout on long analysis tasks.  We have added a global option which sets a limit on how much time the tool will spend analyzing any one database.  If the time is exceeded, the scan will stop and the tool will throw an exception, just as it would if the database were not accessible.  If you have the tool set to log exceptions without interruption, it will immediately move on to the next database.

[screenshot]

Resolve truncated user names in usage analysis data. Since the data that Domino stores for user activity truncates user names to 32 characters, we have added a way to resolve truncated user names in the Domino Directory.  This will be an option, since we know it will slow down usage scans, and it will only be available in the Recompute function.  This feature uses the configured User/Group resolution server for performing the Domino NameLookup functions.  Note that you can select multiple databases for the Recompute and leverage caching of any NameLookup operations for the entire group.  The tool always looks for “known long names” in the ACL and data analysis first to avoid doing the NameLookup whenever possible.

[screenshots]

Understanding DEI and other complexity metrics

The analysis component of Notes Migrator for SharePoint calculates several complexity metrics for each database, including Microsoft’s Design Element Index (DEI) method.  These are described at a high level in the documentation, but we sure do get a lot of questions about exactly how those numbers are computed.  I will attempt to give an answer here, and then at the end of the article I will describe why you should NOT take these numbers too seriously.

 

Microsoft’s DEI method is really simple.  (As I have said many times, it is currently TOO simple.)  We just count up the design elements of each type and decide which DEI column each type goes into, according to Microsoft’s guidance.  (The snippet below is that logic, wrapped in an illustrative method signature.)

        // Given a design element type and the net count of elements of that
        // type, return the DEI complexity column (1-5) per Microsoft's guidance.
        static int GetDeiColumn(string Type, int Net)
        {
            switch (Type)
            {
                case "Form":
                    if (Net < 2)
                        return 1;
                    if (Net < 5)
                        return 2;
                    if (Net < 11)
                        return 3;
                    if (Net < 15)
                        return 4;
                    return 5;
                case "View":
                case "Folder":
                    if (Net < 5)
                        return 1;
                    if (Net < 10)
                        return 2;
                    if (Net < 20)
                        return 3;
                    if (Net < 40)
                        return 4;
                    return 5;
                case "Page":
                case "StyleSheet":
                    if (Net < 1)
                        return 1;
                    return 3;
                case "Agent":
                    if (Net < 1)
                        return 1;
                    if (Net < 5)
                        return 2;
                    if (Net < 10)
                        return 3;
                    if (Net < 20)
                        return 4;
                    return 5;
                case "DatabaseScript":
                case "Subform":
                case "ScriptLibrary":
                    if (Net < 1)
                        return 1;
                    return 3;
                default:
                    return 1;
            }
        }

Based on these formulas, we get a DEI ranking for each type of element:

[screenshot: DEI rankings by design element type]

The max is 3 in the above case (in blue) and the average is just a simple average of the six 1s, one 2, and five 3s (in red): (6×1 + 1×2 + 5×3) / 12 = 1.9.  In versions 5.0 and 5.1 we always rounded down (to 1 in this case), but in 5.2 we round to the nearest integer (2 in this case).
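As a minimal C# sketch (illustrative only, not the product’s actual code), the roll-up for the table above looks like this:

        using System;
        using System.Linq;

        class DeiRollup
        {
            static void Main()
            {
                // Per-type DEI rankings from the table above:
                // six 1s, one 2, and five 3s.
                int[] rankings = { 1, 1, 1, 1, 1, 1, 2, 3, 3, 3, 3, 3 };

                int max = rankings.Max();                // 3
                double average = rankings.Average();     // 23 / 12 = 1.92
                int rounded = (int)Math.Round(average);  // 2 (the 5.2 behavior)

                Console.WriteLine($"Max = {max}, Average = {rounded}");
            }
        }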

 

Quest’s complexity algorithm is probably a little more realistic than DEI, but it still suffers from the above limitations and again should be taken as a first approximation for now.  One cool thing that our algorithm allows you to do is focus your manual analysis work on the templates and then let our tool calculate the INCREMENTAL complexity (based on the deltas from the template).  Incremental complexity only considers the design elements that were new or modified relative to the associated reference database.

Plus we have Data Complexity (rich text with embedded OLE objects is more complex than plain text fields, etc.).

You can decide how you want to weight the above three complexity numbers using our configuration settings.

At the end of the day, none of these algorithms look inside the actual LotusScript code to see what it does, etc.  Until we start doing that (in a future version) we definitely tell people to take this as a first approximation that will tell you where to start doing manual analysis.  Our tool allows you to override the automatic calculations with the results of your analysis work. 

For a lot more detail on what I believe is the RIGHT way to think about application complexity, see the white paper described here:  [Streamline Your Migration of Complex Notes Applications to SharePoint].

All Database View/Report Columns

With each new release of Notes Migrator for SharePoint, we add new functionality and this usually means new database properties that you can view in the Migration Console.  For example, our new “Blocked / Oversized File Detection” feature gives you four new columns that you can add to any database view or report:

[screenshot]

With version 6.0, we are now up to 172 columns that you can display for each Notes database in your environment.  These columns are set in a variety of ways:

  • Discovery
  • Data Analysis
  • Design Analysis
  • Usage Analysis
  • Automatic Classification
  • Automation via Class Rules
  • Manual Triage
  • Performing Migrations

While we document all the available database columns in the appendices of our User Guide, it still can be difficult to understand what columns are available to solve a particular problem or where a given piece of data comes from. 

I have, therefore, created a new spreadsheet that includes extended documentation and is sortable in a number of ways.  Download it here: [NMSP 6.0 Database View Columns.xlsx].

Extracting all the users from a set of databases

The Extract Database Users tool (new in Notes Migrator for SharePoint 5.3) allows you to select one or more databases in any database view in the Migration Console and then extract all the user names contained in those databases.  This tool is useful for simply gaining an understanding of the users involved in a group of Notes applications, but the primary purpose of the tool is ultimately to generate a user mapping file that NMSP can use at migration time.

[screenshot]

Depending on how much analysis has been done for these selected databases, we may extract user names from the database ACLs, the Created By/Modified By metadata, the document level security, or the usage activity.  As explained below, users may also be added to the list by expanding Domino groups and by importing existing NMSP User Mapping XML files.  The sources of the user names are listed in the view columns shown below, and you can filter the sources shown to only list users that came from certain sources.

[screenshot]

The type of user name (Person, Group, Unspecified) is also shown and you can filter based on them.  Note that Unspecified users may become specified as you perform certain operations such as group expansion or imports.  Finally, you can manually set the user type by using the combo boxes in the view. 

[screenshot]

You can also filter by the Notes domain.  This is a simple text match against the last part of the abbreviated name, so either “Westford/IBM” or just “IBM” would select “John Smith/Westford/IBM”.  It is common to want to select all the users plus all the groups, which the “No domain” checkbox allows you to do.

[screenshot]

If a group is listed in a database, it is sometimes useful to be able to find all the members of the group.  If you press the Expand Groups button, NMSP will contact the configured Group Resolution Server (from Advanced Configuration) and look up every Group and Unspecified entry (in case it really is a group).  Any new members will be added to the list and indicated as “ACL via Group”.

[screenshot]

To remove users from the list, you can select one or more rows using the selection column on the right (or press Control-A to select all of them) and press the Remove button.

[screenshot]

The last column in the view is the SharePoint names column.  You can set these names automatically using the Import function, using the Set SharePoint Names function, or by typing them in manually. 

The Import process loads in users from existing XML User Mapping files and sets the Imported column.  Imported data is merged with existing data but if a SharePoint name is specified in the imported file, it will overwrite the existing name every time. 

The Set SharePoint Names function gives you several ways to automatically assign your SharePoint names. 

  • Load users from Domino Directory – use any field in the user’s Person document in the Domino Directory as the new SharePoint name
  • Set Default using Format String – generate a new SharePoint name by substituting the various parts of the Notes name
  • Set Default using the Notes common name – use the simple common name as the SharePoint name

Note that in all these cases, existing SharePoint names will be preserved unless the Override SharePoint Names flag is checked.

[screenshot]

Finally, you can press the Export button to generate a User Mapping file (either in an XML or comma delimited format).

[screenshot]

Specifying alternate repository database with -rd command line option

The Notes Migrator for SharePoint Migration Console supports the ability to read a repository database argument from the command-line.

To use this feature, open a command prompt and start the console like this:

mmc.exe C:\NotesSharePoint\Migrator\MigratorConsole\migratorconsole.msc -rd=YourTemporaryRepository.nsf

The console will start with the database you specified.  As long as you don’t save your configuration, your former repository will be used the next time you start the console without the “-rd” parameter.

To indicate an alternate repository is loaded, the Root Node label in the navigation tree will have the repository name appended.  Example:  “Notes Migrator for SharePoint (TempRepository.nsf)”

The way I use it most is the way our system integrator partners use it:   Different clients have different repository databases, each containing data from that customer’s Notes environment.  It is common to want to quickly switch between them.

Another use is to segment a large environment.  Since performance of redrawing views, etc., degrades once you have more than 2,000-5,000 databases loaded (depending on your environment), a customer with a 20,000-database environment might want to segment it into 4-10 different repository databases.

Migrating Lotus QuickPlace to SharePoint (Part 3): Discovery and Automation

In Part 2 [link] we examined the ins and outs of QuickPlace / QuickR migration jobs.  All the examples used the Notes Migrator for SharePoint Designer Client, which specializes in designing and running one migration job at a time.  As discussed, QuickPlaces usually require multiple migration jobs in order to move the right content to the right type of list in SharePoint (Pages, Tasks, Discussion, Calendar, etc.).  Not only that, most QuickPlaces have multiple (sometimes hundreds of) rooms and sub-rooms, each of which would typically be migrated to a distinct SharePoint site or sub-site.  This means that migrating even a small number of QuickPlaces would get very tedious very quickly if you had to migrate each bit of content in every sub-room using the Designer Client.

Enter the other client, the Notes Migrator for SharePoint Migration Console.  This is the client that spans thousands of Notes databases instead of just one at a time and allows for a great deal of automation of your migration tasks.  This is extremely valuable in any large migration project, but is especially valuable for QuickPlace / QuickR projects.  Most QuickPlace rooms and sub-rooms are based on standard templates and, even though they tend to proliferate very quickly, they usually lend themselves to automated provisioning and automation quite nicely.

First, a short word about machine requirements.  Although the tool is documented to require less, I definitely suggest that you have at least 4 GB of RAM available if you have a large environment to analyze and migrate.  Also watch out for your repository size.  If you have more than 4,000-5,000 databases you should consider splitting the project into multiple separate repositories, as described elsewhere.

(Warning!  Some of the features described below are only available in the Notes Migrator for SharePoint 5.x Premiere Edition today.  When version 6.0 is released, they will be available in the Starter and Standard Editions as well.)

Discovery

Everything in the Migration Console starts with Discovery.  This is the part where the tool scans your various Domino servers and finds all of the databases that are available there.  There are actually several types of Discovery you can perform with the product, all of which are available as actions on the main Notes Migrator for SharePoint scope node (top left) of the console. 

[screenshot]

The first type of discovery is the general Discover Databases function.  This discovers all Notes databases on the server, regardless of whether they are really QuickPlace databases or not.  As shown below, you can pick the Domino servers you want (click on Manage Locations to add another server to the global list) and indicate that you would like to do a header analysis along the way. 

The tool attempts to classify all Notes databases based on the classification rules programmed into the tool (more on that later).  It can recognize that a database appears to be based on a QuickPlace template, but it does not attempt to look much deeper than that.  This is why you will see classes such as “QuickPlace (Unorganized Content)” in the various Notes Databases views; we have not really looked inside the database yet to see whether it is a room or sub-room and where it fits in a QuickPlace hierarchy. 

While this type of discovery is optional for a QuickPlace migration project, it can be very useful in cases where you have orphaned databases (no longer part of any hierarchy) or databases that you can’t access using your current Notes ID.  These will remain as “QuickPlace (Unorganized Content)” even after you do the deeper discovery (below).

[screenshot]

Next you can also perform Discover QuickPlace Organization or Discover QuickR Organization operations.  As shown below, you can pick the QuickPlace servers you want (click on Manage List to add another server to the global list).  This type of discovery will crawl the metadata inside the QuickPlace databases and determine how they are structured in QuickPlace application terms: places, rooms, sub-rooms, etc.  Note that the classification of these databases will change to “QuickPlace”, “QuickPlace Sub-room”, etc., instead of just “QuickPlace (Unorganized Content)”.  More importantly, notice that you can now view the hierarchical QuickPlace application model under the Applications scope node.

[screenshot]

Crawling a QuickPlace hierarchy is significantly slower than doing a simple database discovery.  For very large environments, where one QuickPlace might have hundreds of rooms and sub-rooms, you may want to discover just one QuickPlace at a time.  The trick to doing this is to specify a discovery location using the “<server>;<place>” format, where <place> is the short name of the QuickPlace (as you would see it in a browser URL).
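For example, to crawl just one place named “sales” on a (hypothetical) server QPServer01/Acme, you would enter the location as:

        QPServer01/Acme;sales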

[screenshot]

Finally, you may want to perform a Discover Directory Entries operation.  This part is certainly optional, but it will examine your Domino directory to determine which of your databases were mail-enabled (and many QuickPlace databases are).  This scan will even tell you what the mail address for each mail-in database was, which can be useful information when configuring similar behavior in Exchange/SharePoint.

[screenshot]

Analysis

Notes Migrator for SharePoint can help you do a great deal of analysis on each database you discovered.  Analysis is discussed in great detail in other posts, but I will summarize the capabilities here:

  • Analyze usage patterns, filtering out any names that should not count as true end-users
  • Analyze data complexity, including how many pages of each page type there are in each database 
  • Determine which documents have blocked or oversized file attachments that will not be allowed on SharePoint
  • Analyze design complexity, determining which QuickPlace rooms have been customized by end users
  • Extract all the user/group names used in a QuickPlace, so you can plan how to map them to Active Directory accounts (and deal with exceptions)

Note that all Discovery and Analysis data is available in the Database Properties dialog and most of it is also available in the Migration Console views.  You can design custom views and reports that display it or simply export it all to an XML file for reporting in Excel and other tools.

Defining the rules for provisioning and migration

Now we get to the really cool part: all that Discovery work you did above was not just so you could see what you have out there on your QuickPlace servers.  All that data you collected is going to be your “home base” for managing and automating much of your migration project.  The piece that makes all this happen is your classification rules.

Underneath the Classification Rules scope node you will see both Technical Classification Rules and Business Classification Rules.  For the purposes of automating your QuickPlace migrations we will be concerned with Technical Classification Rules and, in particular, the “QuickPlace” and “QuickPlace Sub-Room” classes.  (Substitute “QuickR” and “QuickR Sub-Room” classes if that is what you have.)

[screenshot]

If you edit the “QuickPlace” class rule properties, you will see various details about how databases of this type are recognized, analyzed and triaged.  For now, skip over to the Auto Target tab.  Check the “Enable automatic Target Identification” checkbox and then select one of your existing SharePoint site collections as the “Base Site URL”.  This sets the top level under which all new QuickPlace rooms and sub-rooms will be migrated.

[screenshot]

Next, check the “Create new site for each database” checkbox and press the Options button to specify how the site will be created.  The Name and Relative URL should be variable (using the substitution codes indicated at the bottom) as this class rule will be applied to many QuickPlaces.  In the example below, the URL will be formatted using the original folder path.  The name and description will include the original QuickPlace application name.  Finally, you can specify the SharePoint site template you want to use (the standard SharePoint Team Site template works nicely, but feel free to specify one of your custom templates) as well as other site creation options.

[screenshot]

Similarly, you should check the “Assign targets for every child application” and “Create new sub-site for every child application” checkboxes and then set the site creation options, as described above.  This will cause all QuickPlace sub-rooms to be migrated as SharePoint sub-sites.  As you will see shortly, this is a recursive process that could generate a tree of hundreds of sub-sites automatically!

[screenshot]

The last item on this tab is optional.  As discussed in Part 1 of this series, QuickPlace rooms have their own menus, which are often customized and are often considered an important part of the design of the QuickPlace.  Depending on your particular situation, you may elect to use the standard Quick Launch menus defined in your SharePoint site template (selected above) or you may decide that you want to migrate the old QuickPlace menus over to SharePoint instead.  If you want to replace the standard menus with migrated QuickPlace menus, check “Provision Navigation Links” and “Replace Links”.

[screenshots]

IMPORTANT: The “Provision Navigation Links” function uses the dynamic Link Tracking Service and therefore requires that the service be enabled when the actual migration occurs.

Now let’s move over to the Migration Jobs tab.  Check the “Enable Automatic Migration Job” checkbox and then define the desired security mapping options.  Security mapping is described in great detail in other posts; I will summarize by saying that the options available here control how QuickPlace access control rules are mapped at the site level, whereas individual migration jobs control how QuickPlace access control rules are mapped at the list and individual document level.

Also note that enabling these options will require you to establish a system for mapping Notes user/group names to Active Directory identities, which is essential for a real migration but might be considered optional for a proof of concept or first test migration where you want to get quick results.

[screenshot]

Finally we have a set of migration jobs.  Every entry here is a complete migration job, as described in Part 2 of this series.  You can add, edit and delete migration jobs using the buttons at the bottom and the pop-up migration job designer. 

Note that Notes Migrator for SharePoint includes a complete set of migration jobs for QuickPlace and QuickR and you may or may not need to edit them.  The default jobs are designed to do a nice job migrating to sites created with the standard SharePoint Team Site template. 

Note that the top level QuickPlace class has an extra job for migrating the QuickPlace Welcome page to the SharePoint site’s Announcement area.  If you do not want to do that, you can simply delete that particular job.

Good reasons for wanting to customize the migration jobs in your QuickPlace class rules include:

  • You are using a different SharePoint template
  • You have a different idea how lists should be named or organized
  • You want to use Wiki pages, etc., instead of custom lists (the default)
  • You want to enable document-level security mapping for all migration jobs
  • You want to enable the Link Tracking Service for all migration jobs

Be sure to press OK to save your work!

You should also edit the “QuickPlace Sub-Room” class rule and perform a similar set of edits.  You do not need to specify the Site Provisioning rules, as these were defined for the parent “QuickPlace” class and will be applied recursively.  You should, however, specify the Provision Navigation Links options, Security Mapping rules and Migration Jobs that you want applied to any SharePoint sub-sites provisioned from your QuickPlace sub-rooms.

Assigning Targets and Jobs to Individual Databases

Now that you have defined your class rules, applying them to your actual databases is very easy.  Under the Applications scope node select the Apply Class Rules action.  Select Yes to include all the sub-rooms (recursively) and then indicate that you want to assign both the Migration Targets and Migration Jobs.  As with most Database properties, Migration Targets and Migration Jobs may be “locked” in some of your databases (because you made some manual entries), so it is usually a good idea to Override All Locks when applying the rules.

[screenshot]

When the assignment is done, you can examine the log files to get an idea of what happened.  Even better, spot-check some of the individual databases to see if things got assigned the way you expected them to.  In the example QuickPlace sub-room below, the site path, the security mapping options and the migration jobs were all automatically assigned according to our rules for QuickPlace sub-rooms.

[screenshots]

Note that the site shown above is indicated as a “Planned Site”.  This means that the SharePoint site does not really exist yet, but you can still assign databases and migration jobs to it.  You can even see it previewed in the tool’s SharePoint scope nodes.

[screenshot]

You can now change any of the assigned properties for specific databases.  In fact a common way to deal with customized QuickPlaces is to first assign a complete set of default rules as described here, and then use the analysis functions of the tool to locate small end-user customizations that have occurred over the years.  You can decide on a case-by-case basis which customizations merit (for example) a customized migration job or even custom SharePoint development.

Migration Time!

There are a couple of tool configuration options (described in detail elsewhere) that you should pay attention to before migrating.

  • Logging level – Use Verbose for debugging, but don’t leave it that way for large migration jobs.
  • User/Group Mapping options – Needed to preserve author metadata as well as access permissions.
  • Link Tracking options – Keeps doc links working, even in an extended migration project.

One configuration option that many people miss is the HTTP Link Detection feature.  If you configure the tool with mappings from URL prefixes to live Domino servers, it will try to resolve any HTTP links to other web-enabled Notes or QuickPlace documents and treat them as dynamic links in our Link Tracking Service.
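For example (hypothetical values), if your Domino server Acme01/Acme serves web content under http://domino.acme.com/, you would map that URL prefix to that server:

        http://domino.acme.com/  ->  Acme01/Acme

Any HTTP link starting with that prefix can then be recognized as a link to a Notes document and redirected through the Link Tracking Service after migration.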

[screenshots]

Finally it is time to migrate.  Under the Applications scope node (above) select the Migrate To SharePoint action.  Select Yes to include all the sub-rooms (recursively) and then indicate that you want to perform all provisioning and migration tasks.  (In practice, many people actually decide to provision all the sites and sub-sites first, verify those, and only then migrate all the content.)

[screenshot]

The process described here is a “soup to nuts” approach for migrating QuickPlace / QuickR environments.  Hopefully you agree that the blend of automation and ability to customize as needed strikes the right balance, and that the trouble of setting the rules up correctly the first time more than pays for itself when you start applying those rules to large numbers of databases.

Best Fit template matching features

Notes Migrator for SharePoint contains the ability to do “best fit” design matching to help identify which applications are based on similar designs, regardless of whether or not they are currently inheriting from the same application template.  This supports the “design consolidation” process, which is crucial to reducing the cost and risk of a large migration project.  To start using this feature, you need to associate specific Notes templates with specific Technical Classes.

Prior to version 5.3, Notes Migrator for SharePoint already had the ability to assign a “Reference database” to a Technical Class, which was useful for detecting the deltas of each member of a Technical Class (the first checkbox).  This feature assumed that we had already assigned Technical Classes to the databases – either automatically (by Rule) or manually.  The most common kind of Class Rule used the database’s assigned Template name to assign classes.

[screenshots]

In contrast to this, the new feature helps you use design comparisons to help figure out what Technical Class a given database might be, even if it does not have a Template assigned.  By enabling “Include in Best Fit testing for all databases” (the second checkbox) you can now compare every database you scan with this Reference database.

[screenshot]

You can set up as many Technical Classes as you want this way.  Think of this as setting up a many-to-many comparison.  Each database you encounter will be compared with the Reference databases of multiple Technical Classes.  The actual comparison occurs when you do the Design Analysis.  Simply check the Compare with class templates for best fit option when you do the analysis.

[screenshot]

The design of each database is then compared with the designs of the templates you want to compare to (see below).  This is similar to the existing feature for comparing the designs of applications and their assigned templates, but this is many-to-many instead of one-to-one. 

Note that this is also available in the new “Recompute” function.  If you have already done the design scan for the database and the templates, you can adjust the list of Technical Classes to test for and re-run the Best Fit comparison without accessing the databases again.

[screenshot]

The “best fit” for each database is displayed in two new view columns as well as on the Design Analysis tab.  A details dialog shows all the partial matches over a certain threshold.

[screenshot]

In the above example, notice that even though the database appeared to be a normal document library, it was only a 90% match to the standard Document Library template.  But it was a 100% match with the custom “Acme Enhanced Document Library”!  If you decide that you want to make that the official Technical Class for this library, just select it and press the “Set as Technical Class for Database” button.

BONUS!  To make this all easier to set up, we added a simple Create Technical Class action.  You can use this to select a Notes database or template and quickly create a Technical Class that uses it as the Reference Database and is otherwise preconfigured for Best Fit testing.  If it is a template, we also set up a class rule for recognizing it that way.

[screenshot]
