Lotus Notes to SharePoint Blog

Blog about Dell's Notes Migrator to SharePoint tool and other things related to Lotus Notes migration projects

Category Archives: Notes Migrator for SharePoint

Q&A from Tuesday’s webcast

Sorry we could not get to all your questions live.  Here is the entire list of questions:

Q:  Is this event being recorded?

    A:  Yes, you can view the recording now at https://www.quest.com/common/registration.aspx?requestdefid=51835.  Sorry about the recording level – be prepared to crank the volume up.

Q: If a Notes application exists for archive purposes only – what type of data migration should occur to get off of Notes but keep the data for a period of time? I’m not sure if SharePoint is the right platform in some cases, but I just want to get your thoughts.

    A: Render with Form is the most common approach. See this post: https://notes2sharepoint.org/2010/11/12/render-with-form/

Q: What QuickR versions do you support?  R8, or also back to QuickPlace?

    A: All versions of QuickPlace and QuickR

Q: Can you speak to how often you have companies using straight out of the box (OOtB) templates from Lotus?  Rather, out of x number of apps, how many of those apps are OOtB? 

    A: Varies widely, but maybe 30% of apps are based on standard templates – doc libraries, discussions, team sites.

Q: If you “skip all the custom code that those crazy notes developers put into place”, do you also have a service offering to help write re-training documentation/collateral?

    A: Dell and Quest PSO teams offer these services, as do our awesome partners.

Q: I’d love to hear more about “Find unused databases”.

    A: Our tool helps you find the last usage dates across all replicas of each database. Simply sort by date and pick whatever cutoff you want to use.

Q: What are the benefits of migrating to SharePoint vs., say, Adobe LiveCycle?

    A: That’s a question for Microsoft. I’m just the humble migration guy.

Q: How do you do the design snapshots? 

    A: We basically build a mini-template – copying all the design elements to a local database in a special location.

Q: What’s the minimum access level on Domino end and on SharePoint 2013 (online) end?

    A: On Domino, you just have to be a reader. On SharePoint, it depends on what you want to do. To write documents, you need write access. To provision lists, sub-sites, groups, or user access, you need to be a site administrator.

Q: Does the free analysis tool support Office 365?

    A: Yes. Or maybe “Not applicable” is a better answer. Analysis just accesses the Notes stuff.

Q: How do you map Domino user IDs to Active Directory accounts for permission migration?

    A: We give you a variety of options, ranging from a directory lookup (if there is some property in the Domino Person document that SharePoint will accept as a unique identifier) to a brute force XML file.

Q: Can your migrations also retain functionality like Domino Attachment and Object Store (DAOS)?  I know this is not a db design feature, but a Domino Server level feature.

    A: No, but I believe that this is all transparent to the Notes APIs we call. So it should all “just work”.

Q: Are the OOtB templates you have considered in your design only from Lotus, or also from vendors?

   A: Only templates from Lotus. Really, however, designing jobs for new custom Notes templates is not that hard once you get the hang of it. The hard part is when you need to develop a new SharePoint template with complex functionality (i.e. something to migrate the content to).

Q: Do your tools have the capability to “decommission” a Notes app once migrated?  For example, either delete it wholesale, alter the ACL, or add a page that redirects the user to SharePoint?

    A: No, we decided never to modify the Notes environment. We get more trust from our customers that way, but it means an extra step for you.

Q: Do your tools also take Roles into account for the ACL, form/view access, document access, etc.?

    A: Yes, we have the option of generating SharePoint Groups from Notes Roles. We use these groups in document-level security. We do not attempt to migrate view/form access.

Q: Can LotusScript or formulas be migrated?

    A: No.  That is usually a manual step, but it is usually best to switch over to OOtB features and declarative workflow anyway.

Q:  Does the “add to lookup list” generate a warning?

    A: Not a warning, but I think we would log adding new items if you have verbose logging enabled.

Q:  What is the specific improvement for doc with one attachment?

    A:  See this blog post:  https://notes2sharepoint.org/2013/04/01/filter-by-rich-text-content/

Q:  I would appreciate a follow up call to discuss the questions I posted to the panelists.

    A:  If you would like to talk about our products in greater detail, your Quest/Dell sales rep will be happy to arrange that.

PS:  A good place to post additional questions is in our SharePoint Notes Migration community site:  http://communities.quest.com/community/sharepointforall/notes

Public webcast later today

Important notice regarding SharePoint 2013 support in Notes Migrator for SharePoint

When we shipped Notes Migrator for SharePoint 6.2 last month, it included support for SharePoint 2013, but we soon realized that it had one very unfortunate limitation:

The Import Service did not support SharePoint sites that used Claims Based Authentication.  This was actually not a new limitation, and it impacted a handful of SharePoint 2010 customers, but it suddenly became very important in SharePoint 2013.  The reason is that all SharePoint 2013 sites use claims by default.  So this effectively meant that most SharePoint 2013 users would be forced to migrate via the native SharePoint web services.  Not a terrible option, but definitely not as fast.  (Migrating via the Import Service can be 3 to 5 times as fast, especially if you have large attachments and use the Share Files Folder feature.) 

Happily, we have now resolved the limitation.  Starting with hotfix build 6.2.0.1060, the Quest Import Service will work as expected even with the new SharePoint 2013 defaults.  This fix will appear on our web site in our next point release (6.2.1), but many customers will not want to wait for that.  Therefore I would encourage anyone who wants to use the Import Service on SharePoint 2013 to proactively contact Quest support and request build 6.2.0.1060 (or any hotfix build after that).

Filter By Rich Text Content

In SharePoint, document libraries contain files. Every “document” is a binary file that you can download to your hard disk, etc. Notes document libraries are more flexible however. Some documents contain just one file attachment, while others may contain lots of rich text (and possibly multiple file attachments) in a rich text “Body” field. In fact, it is common to see mixed usage patterns in one document library, making it difficult to figure out the best migration target.

In the past, the best approach has been to migrate document libraries to a SharePoint list instead of a document library. SharePoint Lists mirror the flexibility of Notes documents well and allow you to capture your rich text (if any) in a “Body” field and migrate zero, one, or many attachments to the list’s attachment area. Unfortunately, you lose all the advantages of SharePoint document libraries with this approach. In the cases where you know that most documents are really just single file attachments, you would probably prefer to migrate those directly to a document library.

What many migration teams want to do is apply the following policy for document libraries:

  • For documents that contain just one attachment (and no other rich text), migrate the attachment directly to the SharePoint document library with all the appropriate security and metadata.
  • For documents that contain Notes rich text, generate a Word or PDF document and place it in the same SharePoint document library with all the appropriate security and metadata.
  • For documents containing neither attachments nor rich text, either skip the document or create a stub entry in the target library.

In order to implement this policy, Notes Migrator for SharePoint 6.2 now includes a new record filtering option for Notes and Domino.Doc data source definitions.

image

On the record selection tab, check the “Select documents based on Rich Text Content” checkbox. This will enable a Details button where you can specify further details. First specify one or more rich text items you would like to inspect. Second, specify the criteria you would like to use for filtering documents:

  • Whitespace only
  • One attachment only
  • Multiple attachments or other rich text

This new record selection option allows you to create multiple migration jobs for each document library, each one implementing one of the rules in the above policy. Remember that the Notes Migrator for SharePoint migration console makes it easy to sequence multiple migration jobs for one database, and to automate these jobs for many databases of the same type.
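
To make the policy concrete, here is a rough sketch in Python of the classification logic it implies. This is purely illustrative (it is not the tool's code), and the attachment_count and has_other_rich_text inputs stand in for whatever the filter learns by inspecting the rich text items you selected:

    def classify_document(attachment_count, has_other_rich_text):
        # Rule 1: exactly one attachment and no other rich text ->
        # migrate the file straight into the document library.
        if attachment_count == 1 and not has_other_rich_text:
            return "migrate attachment directly"
        # Rule 3: nothing but whitespace -> skip or create a stub entry.
        if attachment_count == 0 and not has_other_rich_text:
            return "skip or create stub entry"
        # Rule 2: other rich text and/or multiple attachments ->
        # generate a Word or PDF rendering of the document.
        return "generate Word or PDF document"

    # Example: a document whose Body holds a single attachment and nothing else
    print(classify_document(1, False))   # -> migrate attachment directly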

Also note that we are planning a similar feature for extracting QuickPlace and QuickR folders, but this is not available in the current build.

Enhanced support for Lookup Fields in Notes Migrator for SharePoint 6.2

In version 6.0 we added support for migrating content to SharePoint lookup columns. In version 6.1 we added provisioning of new lookup columns. Now here are two new advanced choices for using Lookup columns to help you design more powerful migration jobs.

Add Missing Choices

Normally when you migrate content to a SharePoint List that contains a lookup column, the Notes Migrator for SharePoint code tries to find the data value and correlate it to a record in the lookup list. For example, you may be mapping a VendorCode item from your Notes data source to the “Vendor Code” column in your SharePoint target list, which is a lookup column that connects to an indexed Vendor Code column in a list containing Vendor records. If you set the new “Add Missing Choices” property in your column definition to true, we will add a new record to the Vendors lookup list if one does not already exist. (This is now very similar to how “Add Missing Choices” works for Choice columns in our tool.)
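
As a rough illustration of that behavior (not the tool's actual implementation), the lookup resolution could be sketched like this in Python; the vendors dictionary is a stand-in for the indexed Vendor Code column in the Vendors list:

    def resolve_lookup(value, lookup_items, add_missing_choices):
        # lookup_items maps the indexed column value (e.g. a Vendor Code)
        # to the ID of the corresponding item in the lookup list.
        if value in lookup_items:
            return lookup_items[value]
        if add_missing_choices:
            # Create a new item in the lookup list; any other required
            # columns are assumed to have reasonable default values.
            new_id = max(lookup_items.values(), default=0) + 1
            lookup_items[value] = new_id
            return new_id
        return None   # no match found and nothing added

    vendors = {"ACME": 1, "GLOBEX": 2}
    print(resolve_lookup("INITECH", vendors, add_missing_choices=True))   # -> 3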

image

Note that in order for this to work, the Lookup list should have reasonable default values for all the other columns (especially the ones that are required columns).

Override Lookup Column

A mapping such as the one described above (mapping a VendorCode item from your Notes data source to a Vendor Code column in your lookup list) is fine for simple cases. But sometimes the situation is more complicated than that, and you need to use some other value from the Notes data source to correlate the data.

Suppose for example that the Notes application used VendorTaxID to link things together but the new SharePoint template is designed to use Vendor Codes. With version 6.2 you can now specify an “Override Lookup Column” property to control how Notes Migrator for SharePoint locates the record in the target list to link to. In the following screen shot, note how the Tax ID column in the lookup list overrides the Vendor Code column that is configured in SharePoint. Now you can map VendorTaxId values from Notes and get Vendor Codes in your SharePoint lookup column in the end.
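
Here is a small Python sketch of the idea (again, illustrative only, with made-up item data): the override simply changes which column is used to find the target item, while the lookup column still ends up referencing that item.

    def resolve_with_override(source_value, vendor_items, override_column=None):
        # vendor_items represents items in the Vendors list, for example
        # {"ID": 7, "Vendor Code": "ACME", "Tax ID": "12-3456789"}.
        match_column = override_column or "Vendor Code"
        for item in vendor_items:
            if item.get(match_column) == source_value:
                return item["ID"]   # this is the item the lookup column will reference
        return None

    vendors = [{"ID": 7, "Vendor Code": "ACME", "Tax ID": "12-3456789"}]
    print(resolve_with_override("12-3456789", vendors, override_column="Tax ID"))   # -> 7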

image

Another great use for this new feature is the case where the Notes application used response documents to link two types of records. For example, it is common to use response documents to link user Comments to a parent document, such as a team’s Action Items (shown below).

image

In order to correlate the Comment documents to the parent documents, we need to use the internal Notes universal IDs (a.k.a. UNIDs). As luck would have it, Notes Migrator for SharePoint always stores the UNID of each migrated Notes document in a hidden “NotesUNID” column in SharePoint. So you can use that column as your Override Lookup Column and simply map the parent IDs to the lookup column:

image

Note: this technique works perfectly with the SharePoint Blog site template. The Comments list has a lookup column that references the Posts list. Now you can correlate Comments with Posts using internal Notes UNIDs, which is a lot more reliable than using Titles (which may contain duplicates).

Direct Folder / Document Set Migration

In previous versions of Notes Migrator for SharePoint, users could map document metadata (for example the Category property of a Notes application or the {BinderName} property for Domino.Doc documents) that would cause folders to be created in SharePoint. Folders would be created as needed as documents were being migrated. This worked for most cases, but there were limitations that customers would occasionally ask about:

  • Since we only migrate folders as a “side effect” of migrating documents, there was no way to migrate empty folders.
  • Similarly, there was no way to create the folders ahead of time (before migrating the documents)
  • There was no way to set permissions, created/modified metadata, or additional data columns on the newly created folders

Now in Notes Migrator for SharePoint 6.2 we offer a way to do direct folder migration. This is really two separate features that work together…

Migrating records to folders

image  image

On the Advanced tab of your Target Data Definition, you can now indicate that you want to migrate to a folder in your target list or library.  In this mode of operation, every record you extract from the data source will result in a folder being created, instead of a document!  The only additional requirement is that you map at least one item to a target column of type Folder (which controls the new folder names). Many of the usual document migration features will now apply to folders including:

  • Mapping of permissions (using the “Map Reader/Author fields” checkbox on your Advanced tab)
  • Mapping created/modified metadata to folders (using the “Preserve Created/Modified” checkboxes on the Map Data tab)
  • Mapping additional data items to folders (requires creating a new Folder content type). 

Note that many job features that would apply to document migration will not apply to folder migrations. For example, document generation and duplicate document handling options would be disallowed in this context.

Extracting information from Domino.Doc Binders

image image

One of the things that customers clearly want to do with this new feature is to migrate all the information from their Domino.Doc Binders to SharePoint folders. To support this, we have added a new option to do exactly that in Domino.Doc Source Data Definitions. Simply check the Binders radio button under Document Selection in your job, and now you are extracting Binders instead of Documents. Each row in the Preview represents a Binder in the current file cabinet, and we have included additional columns for all of the standard Binder metadata available in Domino.Doc. Of course you can add additional columns to this query as well.

So putting these features together, you would typically map the {Title} property of your data source to a Folder column in your target. Simply checking “Map Reader/Author fields”, “Preserve Created/Modified identities”, and “Preserve Created/Modified dates” should bring over most of the other metadata but you can certainly add additional mappings if desired.

Note that this feature will only write new SharePoint folders; it will not update existing ones with the same name. So a best practice is to run the Binder migration job first (to create the folders with all the properties intact) and then run your normal document migration job.

Also note that we are planning a similar feature for extracting QuickPlace and QuickR folders, but this is not available in the current release.

Migrating to Document Sets

Similar to migrating to folders, Notes Migrator for SharePoint 6.2 also gives you the ability to migrate directly to document sets. The situation here is very similar to what was described above. The tool already allowed creation of document sets as files within those document sets were being migrated. This is a powerful and popular feature, but it suffered from some of the same limitations:

  • Since we only create document sets as a “side effect” of migrating documents, there was no way to create empty document sets.
  • Similarly, there was no way to create the document sets ahead of time (before migrating the documents)
  • There was no way to set permissions or created/modified metadata on the newly created document sets separately from the documents.

The solution is similar to the folder solution described above. On the Advanced tab of your Target Data Definition, you can now indicate that you want to migrate to a document set in your target list or library.  In this mode of operation, every record you extract from the data source will result in a new empty document set being created, instead of a document!  The only additional requirement is that you add a target column of type DocumentSet and map a value to the DocumentSet.Name property. All of the other features of document set migration (described here) still apply. The difference is that every record you select gets mapped to a document set instead of a file within a document set.

Automatically ZIP file attachments

Many customers have requested the ability to compress Notes file attachments while migrating them to SharePoint. There are a number of good reasons for wanting to do this:

  • Save disk space on SharePoint server
  • Get around SharePoint file restrictions (i.e., blocked file extensions and/or size limits)
  • Reduce the bandwidth needed to send data to remote SharePoint servers
  • Eliminate problems (hangs and memory leaks) when embedding certain types of file attachments inside Word documents

Notes Migrator for SharePoint 6.2 now allows you to achieve this in Notes and Domino.Doc migration jobs via a new property on Attachment columns in your Source Data Definition. The “Compress” property may be set to “None” (the default) or to “Zip”.

clip_image002

You can also configure a set of global exceptions to this rule on the Notes tab of the tool’s Options dialog. The “Compression Exclusions” option allows you to specify any file extensions that should never be zipped. This would typically include media files that are already well-compressed and would not benefit from zipping.

clip_image004

When the Compression property is set to “Zip”, all extracted attachments will be compressed and placed inside a ZIP file when migrated. As shown below, any icons or text links to the attachments will appear the same as before, but when the user clicks on them they will open up a ZIP file instead of the “raw” file.

clip_image006

Additional Notes

Zip files will always be excluded from further zipping, even if they are not specified in the “Compression Exclusions” list.
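
Putting the compression rules together, the decision of whether a given attachment gets zipped looks roughly like this (an illustrative Python sketch, not the product code; the sample exclusion list is hypothetical):

    import os

    def should_zip(filename, compress_mode, exclusions):
        # exclusions mirrors the "Compression Exclusions" option,
        # e.g. {".jpg", ".mp3", ".mp4"} (hypothetical values).
        if compress_mode != "Zip":
            return False
        ext = os.path.splitext(filename)[1].lower()
        if ext == ".zip":
            return False          # ZIP files are never re-zipped
        return ext not in exclusions

    print(should_zip("report.exe", "Zip", {".jpg", ".mp4"}))   # -> True
    print(should_zip("photo.jpg", "Zip", {".jpg", ".mp4"}))    # -> False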

If you wish to use this feature to migrate files that were previously blocked by SharePoint (for example .EXE files), be sure to remove these extensions from your Blocked Files list on the SharePoint tab of the tool’s Options dialog.

When using this feature with Domino.Doc sources, add a custom Attachment column as described above. Map this custom column (instead of the predefined {Attachments} column) when mapping attachments to a target column.

clip_image008

This feature is not currently available for QuickR or QuickPlace.

Attachments in generated Word documents… use caution!

We have had a number of support cases on this over the last year, and we have made an effort to clarify our position on this in our latest release notes:

Migrating file attachments inside of MS Word documents is not recommended for large migration jobs. Since Word attachments are implemented as “Packager” OLE objects, the migration process is forced to invoke Packager code as well as the OLE handlers for the file type in question (often in separate processes). The problem is that each type of attachment is handled differently depending on which type of application created the attachment. So (for example) the first 1000 attachments may work fine and then document 1001 has a different type of attachment that causes a memory leak when Microsoft converts it into a Packager object. When migrating attachments and OLE Objects to embedded objects in MS Word, it is highly recommended that the workstation performing the migration has native applications installed that can open and edit every type of attachment that the migration jobs will be encountering during the migration.

The two recommended workarounds are:

  • Migrate attachments separately to the SharePoint document library. (The links from the Word documents to the attachments will be preserved.)

  • Place all attachments inside ZIP files inside the Word documents. (This is a new feature for version 6.2.)

If you still insist on embedding attachments into Word documents, you should understand that you are doing so at your own risk and that Quest support will probably not be able to help you if problems occur. Here are some tips that have helped other customers in the past:

  • There are known issues with packaging Adobe Acrobat 10 objects. If you have PDF attachments, be sure to install Adobe Acrobat Reader version 9 on the workstation.

  • Packager leaks usually have to do with cases where packager cannot use type-specific COM objects (including all .MSG files, and including .PDF files when the PDF reader is not installed)

  • Leaked resource handles accumulate per Windows process. This means that you should restart NMSP Designer between large jobs, even if they are successful. It also means that batches of jobs in the NMSP Migration Console would still not be a good idea.

  • There may be other issues with packaging certain types of objects. Since we cannot predict how third-party OLE handlers will operate, this is beyond our control.

Introducing Notes Migrator for SharePoint 6.2

This release will be downloadable from the web site in a couple of days.

The biggest feature by far is full SharePoint 2013 support.  Everything you could do in SharePoint 2010 you can now do in SharePoint 2013 and the new version of Office 365.  It’s that simple!

The other features from our release notes are:

  • Zip attachments while migrating
  • Support for Pass-through HTML
  • Special handling of documents containing just one attachment
  • Full folder migration (example: Domino.Doc binders)
  • Support for multiple Notes passwords
  • Improved lookup field support (add missing choices)
  • Improved support for migrating to Claims-based environments
  • Render with form improvements (computed subforms, etc.)
  • Discovery process is more robust

Some of these features will be more self-explanatory than others.  I will be posting detailed walkthroughs of a lot of these features here in coming days.  You can also check out the “what’s new” section of the user guide or watch my recorded “update” training here: http://communities.quest.com/docs/DOC-14812

Checklist for planning the migration of a Notes Database

Before migrating a Notes database to SharePoint, there are a variety of details to gather and choices to make. Thinking through the items on this checklist will help ensure a smooth migration process with minimal surprises.

Source database details

  • Source type (Notes, QuickPlace, QuickR, Domino.Doc)
  • Source name
  • Notes template used (if any)
    • Level of customization from template
  • Domino server & path (closest replica to migration team)
  • Special Notes identity required to access database (if any)
  • Public keys needed to decrypt documents (if any)
  • Approximate number of Notes documents to be migrated
  • Approximate size of Notes content to be migrated
  • Desired priority of migration
  • Desired phase/group of migration (intended to capture clusters of databases that should logically be migrated together)
  • Documents that should / should not be migrated
  • Unique “document types” to be migrated (this usually correlates to distinct Notes forms that back existing data documents)
  • Design Details that will need to be accounted for
    • Forms
    • Workflow
    • External Connectivity
  • Third Party validation/security auditing
    • PCI Compliance
    • FDA Compliance
    • HIPAA
  • Security and permissions that will need to be maintained and updated
  • Archiving of data
    • Date Range
    • Data Type

Target Site Details

  • SharePoint site URL
  • Account to be used for provisioning and content migration
  • New or existing site?
    • Template to use for provisioning site (may be custom developed)
    • Site provisioning details – parent site, inherit permissions
    • Provision site security (user permissions, group permissions, roles)?
    • Are specific content databases required?
  • Source of identity mapping information
    • Users
    • Groups
  • Out-of-box SharePoint features to enable/leverage
  • Do file size limits and blocked file types need to be changed from the defaults?
  • Site Columns in use
  • Content Types created
  • Workflow type and creation
  • Search and Indexing

Target List/Library (specify for each document type if appropriate)

  • Target list or library name (Naming schemes in use)
  • New or existing list?
    • Template to use for provisioning list (may be custom developed)
    • Add to Quick Launch
  • Provision list/library security (user permissions, group permissions, roles)?
  • Allow tool to upgrade existing schemas if needed?
  • Desired folder structure within list or library
  • Map created/modified identities/dates
    • Behavior if user mapping fails (fail or substitute default user)
  • Out-of-box SharePoint features to enable/leverage
  • Third Party solutions (templates, web parts, etc.) in use or available

Mapping details (specify for each document type if appropriate)

  • Content Type for migrated documents (if any)
  • Target document format
    • List Items
    • Extracted binary file attachments only
    • Generate InfoPath docs (which template; target Forms Server)
    • Generate Word 2007 docs (which template)
    • Generate Adobe Acrobat (PDF) documents
    • Generate Web Part pages (which template)
    • Generate Content pages – Publishing pages, Basic pages, Wiki pages
    • Generate HTML files
    • Generate MIME files
  • Use separate library/folder for attachments, images, embedded objects?
  • Field mappings
    • Source data item/column names and types
    • Data transform required (example: translation of keyword values)
    • Target data column names and types
    • Using Lookup columns?
    • Using Managed Metadata columns?
  • Migration options
    • Map document level security (readers / writers)?
    • Convert DocLinks to dynamic links (i.e. use the Link Tracking Service)?
    • Map discussion/response hierarchies
    • Map calendar logic (repeating meetings)
    • Map approval status / workflow state? (How expressed in Notes app?)
    • Map version histories (How expressed in Notes app?)
    • Map data in date ranges
    • Render with form (archive only)

 

NOTE: Two important items were not covered in the above checklist:

  • Development work that may be needed in SharePoint Designer, InfoPath, Visual Studio to reproduce the custom logic you had in Notes.
  • The opportunity to reduce your migration effort by consolidating similar application designs, reusing custom development work, and automating large scale migrations.

Both of these topics are discussed in detail in the Quest white paper “Migrating Lotus Notes Applications to Microsoft SharePoint: Understanding Application Complexity and the Value of Consolidation and Automation” which may be downloaded from http://www.quest.com/documents/landing.aspx?id=9746.

Online Documentation is live!

The full Notes Migrator for SharePoint documentation set installed with the product (User Guide, Installation Guide, Release Notes) is now also available online at  http://documents.quest.com/Product.aspx?id=20.  This will become a standard for many Quest products, but I am happy to say that we were one of the first!

One thing I like about this new facility is how the search works.  Depending on where you start from, you can search one particular document, all Notes Migrator for SharePoint documents, or all Quest documents.

Exporting attachments and generated files to the file system

In case you missed it:  in Session 6 of my Partner Training webcast series, I introduced a free utility for reorganizing the intermediate files that Notes Migrator for SharePoint writes to disk.  Not everyone was clear on what such a utility was good for, so I thought I should elaborate:

Every once in a while, we hear one of these requirements from customers:

  • Extract all the attachments from a Notes database and place them in the file system (instead of SharePoint)
  • Convert Notes documents to Word Documents and place them in the file system (instead of SharePoint)
  • Convert Notes documents to PDF Documents and place them in the file system (instead of SharePoint)
  • Convert Notes documents to InfoPath Documents and place them in the file system (instead of SharePoint)

The trick to doing any of these things is to use our intermediate data file format.  Just design a Notes Migrator for SharePoint migration job and, when running the job, select Save to Intermediate File.  This will give you the option of saving any attachments or generated documents as disk files rather than encoding them in the main XML output file.

image

The problem with this approach is that the files that are saved to disk are not really that useful, because they are arranged by Notes UNID rather than in any way that would be useful to humans.  That’s where the free utility comes in.  This utility, which may be downloaded from our Notes migration community site, will take all these files, place them in the folder (or network share) of your choice, and arrange them according to your requirements.  If you want to see an example of this tool in action, watch the video of Session 6.  The slides start at about minute 77 and the demo starts at minute 91.  Have fun!

Notes Migrator for SharePoint and Quest Web Parts collaborate to deliver support for migrating Notes Sections

The best kept secret in Notes Migrator for SharePoint 6.1 is that we added full support for migrating Notes sections.  That’s right, I’m talking about the expand/collapse sections with the little triangles in Notes rich text items.  When we migrate any rich text that contains such sections, we generate DIV tags for the section headers and section bodies in the resulting HTML. 

The problem, of course, is that these tags do not do you much good because, by default, SharePoint does not know how to use them.  That’s where Quest Web Parts for SharePoint version 5.7 comes in.  If you place our rich text web part on your page, all those migrated sections suddenly start working!  Even better, users can continue to create expand/collapse sections in new documents.  Won’t your Notes users love that?

image

See this post on the new version of Quest Web Parts for SharePoint.  If you scroll to the bottom, you will see a section “Improved Integration with Notes Migrator for SharePoint” and a really cool video describing what is going on.

Quest Web Parts for SharePoint has always been popular with Notes customers.  Features such as multi-level threaded discussions, dynamic navigation areas, and tabbed form layouts have been helping migration teams satisfy the requirements of their migrated Notes applications (at a drastically lower cost than hand-coding those features) for years now.  Of course it is not just for migrated Notes apps; it is really useful in many other SharePoint development scenarios.

Notes Migrator for SharePoint 6.1 Reviewers Guide

Notes Migrator for SharePoint 6.1 is a significant product release that pushes the product further in three important areas:  Design Migration, Content Migration and Pre-Migration Analysis.  This release will be “Generally Available” on the Quest web site in a few days.  If you were in the beta program you already have the final build (6.1.0.626).  Below is a high level list of this release’s most important new features.

Design Migration

Migrate Notes Views to SharePoint Views – This feature will allow you to select a Notes view and generate a similar view in any SharePoint list or library.  Because the mapping of columns in your view is intimately tied to the mapping and provisioning of data columns that occurs in your migration job, this new capability is also tied to your migration job.  Specifically, view migration is found on the new Advanced tab of your Target Data Definition.  You can design new views there and the views are provisioned when the migration job is run.  A nice side effect of this model is that you can run the same migration job against many new or existing lists or libraries.

clip_image001

clip_image002

Press “Import from Notes” and then select the Notes view you want to migrate.  The view migration wizard will then attempt to design an equivalent SharePoint view, mapping as many Notes columns as possible to existing SharePoint columns OR to new columns defined in your job’s target data definition.  As you might expect, there are a number of things that can go wrong with such an automated mapping.  The two most likely issues are (1) the Notes data item shown in the view has not been migrated to the SharePoint list yet or (2) the Notes view column is computed and the formula is too complicated for a tool to convert.  For issues like this, manual intervention is required and the migration wizard guides you through that.  In the View Columns step, it gives you a side-by-side view of the Notes and SharePoint columns and highlights the parts that it needs help with in red.  You can decide to manually map an existing Notes column, define a new data column (which will be added to your migration job), or even specify a formula for a new computed column in SharePoint.  Similarly, you can also specify the sort order, the grouping, and the document selection rules, or you can just take the defaults. 

Content Type Generation – This feature takes the tool’s ability to migrate schema (select fields from a custom Notes application and provision a similar schema in a custom list) to a new level.  Now you can migrate your custom Notes application schema to SharePoint content types instead.  You can create new Content Type Definitions on the new Content Types tab of your Database or Technical Class records in the Migration Console.  You can design them from scratch or, more likely, you will generate the Content Type definitions from a Notes Form, from an existing migration job, or from an existing list that you have been working on.  (The last two options allow you to do your initial development and testing on a custom list and then transition to content types later.)

clip_image003

clip_image004

Once you have generated an initial Content Type Definition, you can further customize it, specifying the parent content type, the group, and the columns.  The columns editor looks like the Target Data Definition columns editor, but in this case you are defining Site Columns instead of List columns.  When you are ready to provision your new Content Type, the tool will try to find existing Site Columns that match your specification and will provision new ones if needed. 

clip_image005

clip_image006

New Column Provisioning Options – The ability to provision lists and libraries while migrating content has long been a popular feature.  This capability has been greatly expanded to allow provisioning a nearly complete list of SharePoint column options, such as default values and data validation.  You can even set columns to be read-only (something you can’t normally do in the SharePoint user interface).  Note that as before, the provisioning options only apply when the tool is first provisioning the columns; it does not upgrade existing columns.

clip_image007

Provision Calculated columns – You can now define Calculated columns in your target data definition.  These will be provisioned in SharePoint when your migration jobs are run.  For example, you could define a column with a formula such as =IF([Cost]>[Budget], "Over Budget", "Within Budget").  This Microsoft documentation page explains the legal syntax for SharePoint formulas:  http://msdn.microsoft.com/en-us/library/bb862071.aspx.  You can also use our new built-in formula editor, which appears at several points in the product.

clip_image008

clip_image010

Provision Lookup Fields – Lookup fields can now be provisioned in SharePoint lists.  Previous versions of the tool allowed you to migrate to Lookup fields, but you first had to manually configure them in SharePoint. You can even configure lookups from lists in other sites in the same site collection (something you can’t normally do in the SharePoint user interface).

clip_image011

Content Migration

New Run Job options – During normal migration jobs, Notes Migrator for SharePoint first takes care of the provisioning steps (creating the list/library if needed, adding the appropriate content types to the list, adding any missing columns if needed, creating views, and setting the list/library permissions) and then migrates the content.  Now you can choose to run just the provisioning bits without migrating the content.  This will be useful in cases where you want to review the resulting list schema and perhaps make changes to the list settings before migrating content.

clip_image013

clip_image015

Also note that the above screen shots show off the tool’s new Run Job button with a drop-down menu of the various modes of running the tool, such as exporting data to and importing data from intermediate XML files.  Since this button is also available in the Migration Console, users can now leverage the intermediate file options from there (they were formerly available in the Designer Client only).  Similar functionality is also available in the tool menus and the bulk Migrate To SharePoint task in the Migration Console.

Migrate by Form Used – When selecting records from your Notes database, you can now elect to query records by form.  You can pick one or more Forms from the Notes database design or type them manually.  This will limit the records selected to include only those data records that were flagged with the forms you selected (i.e., where the “Form” item was set to one of those form names).

clip_image017

clip_image019

If you specify forms in this way, the Select Items and Columns dialog will also display the data items defined on your selected forms.  (Remember, however, that background agents, etc., may have also set data items on the Notes documents, so you may still need to look at the Sampled Items node to discover those.)

Improved Content Type Support in Migration Jobs – Notes Migrator for SharePoint already had pretty good support for dealing with content types while migrating content, but this release extends those capabilities and also makes them much easier to use.  The new Manage Content Types section on the first tab of the Target Data Definition allows you to push site content types into a new or existing SharePoint list.  Adding the content type to the list is a prerequisite to writing documents with those content types, so this new feature eliminates that manual step and makes more automation possible. 

clip_image021

In addition, you can now associate each content type with one or more Notes forms.  If you do this, then the tool will automatically assign the content type for each migrated document without the need to do any explicit mappings.  (For more complex scenarios you can still explicitly map any piece of Notes data, including the result of a formula evaluation, to a ContentType field.)

Note that this feature is completely independent of the ability to generate new Site Content Types, as described above under Design Migration, but you may often find yourself using the two in combination.

Normalize multi-valued items to multiple documents – It is common for Notes forms to implement detail records (the things that relational developers would have set up a one-to-many relationship for) using multi-valued data fields arranged to look like a table, as shown below. Users would then fill in as many entries as they needed, making sure that the values from each multi-valued item lined up properly.

clip_image023

clip_image025

Now you can use Notes Migrator for SharePoint to extract the multi-valued items as separate records. To do this, set the Multi-Value disposition option for the columns that you expect to contain arrays (ContactName, ContactTitle, and ContactPhone in the above example) to “Normalize”. This will cause NMSP to generate multiple SharePoint items (three in the above example) from the values in the columns designated for normalization.

clip_image027

clip_image029

Note that if you also wanted “header” records you could first migrate those using a separate migration job. You could use lookup fields or some other mechanism to relate the “header” and “detail” columns. In the above example, Customer ID in the Contacts list is set up as a lookup field that references the Customers list (where Customer Name and Category are also stored).

Finally, you may encounter Notes forms that list out each item in separate rows, instead of using multi-valued items. NMSP can handle that case as well, but the migration team will need to do a little work with formulas to generate multi-valued columns to Normalize on.  For example, you could define a source data definition column with the formula “Product_1: Product_2: Product_3: Product_4” to generate a “Products” array.
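
To visualize what Normalize does to each Notes document, here is a small illustrative Python sketch (not the tool's code); the record layout and column names are just example values:

    def normalize(record, multi_value_columns):
        # Turn one Notes document with parallel multi-valued items into
        # several flat records, one per array position; single-valued
        # items are repeated on every generated row.
        count = max(len(record[c]) for c in multi_value_columns)
        rows = []
        for i in range(count):
            row = {}
            for name, value in record.items():
                if name in multi_value_columns:
                    row[name] = value[i] if i < len(value) else ""
                else:
                    row[name] = value
            rows.append(row)
        return rows

    doc = {"CustomerID": "C100",
           "ContactName": ["Ann", "Bob", "Cam"],
           "ContactTitle": ["CEO", "CTO", "CFO"]}
    print(normalize(doc, {"ContactName", "ContactTitle"}))   # three items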

clip_image031

Dynamic link tracking for URL columns – When migrating Notes data to SharePoint URL fields, you can now leverage the tool’s popular Link Tracking Service for those links (the same way you currently do for Doc Links in Rich Text Fields).  You do not need to do anything special to enable this.  Simply start migrating data items that store notes:// links (or formulas that generate notes:// links) and they will be converted to dynamic links that will ultimately point users to the migrated versions of those documents. 

As a reminder, the tool supports two input formats for Url fields.

  • <url> – sets the URL to be the display name in SharePoint
  • <url>, <name> – sets the URL and display name separately in SharePoint

Improvements in “Render With Form” function – This very popular feature, which allows you to extract the content of any custom Notes database (regardless of the complexity) so you can archive it as simple rich text documents in SharePoint, has been improved in a number of respects. Handling of dynamic subforms, computed fields, computed text, and keyword fields has been improved, resulting in even better looking rendered documents.

Provision Alternate Libraries – Users have often relied on the tool’s unique ability to migrate images and attachments to a separate location from the main documents being migrated.  For example, you might want to place all attachments in the Shared Documents or Site Assets document library.  Previously you would have had to provision these alternate libraries manually before running your migration job.  Now the tool provisions these automatically if needed.

clip_image032

Analysis / Migration Console

Capture Design Copies during Analysis – Now when you perform a Design Analysis on a set of databases, you can elect to also create a local design copy.  For every database scanned, the tool will create a small copy of the database (design elements only, similar to a design template) on the local machine on which the design analysis is being performed.  The intent of this feature is to allow consultants and other migration personnel to view the full database designs even while they are disconnected from the production environment, which is particularly important on large analysis projects where in-depth manual design analysis is required.  The location of these databases is controlled by a setting on the Analysis tab of your Global Options dialog, and defaults to a folder in the tool’s <ProgramData> area.  Users can then easily open the Local Design Copy from any database view.

clip_image033 clip_image034

Classify by Last Used (All Replicas) – This enhancement expands the available options for automatically determining which Technical Class or Business Class a database belongs to.  Now you can create a rule for recognizing class members based on when the database was last used across all known replicas.  For example, you can create a class that groups all databases whose Last Used date is more than 365 days ago.

clip_image036

Import data into repository from CSV files – This enhancement allows users to read in records from a CSV file and update values in certain Quest Repository database records.  The tool is available on the main Notes Migrator for SharePoint node.  First the user is prompted to select a CSV file, and then a mapping dialog is displayed.  For each column in the CSV file, the user can choose to map it to one of the available database record properties.  Not all database properties can be imported from external data sources.  The intention here is that users who perform manual analysis/triage projects using spreadsheets or external tools can import that data back into the Migration Console.

This feature can also be used to add new database records into the repository (by Server and File Path, Replica ID, or Database Key).  The intention here is to allow users to import a list of databases to be analyzed or migrated.  Users would typically follow up such an import with a full analysis of those databases to populate the remaining database properties.

clip_image038

General

Performance and scalability for Migration Console – The database views have been re-architected to support 60,000+ databases at a time.  With a very large repository, users may experience a delay while starting the tool as all the records are loaded into memory.  After that, scrolling, sorting and filtering database views and opening additional views should be very fast.  Special protections have also been added to prevent you from running very large reports that are likely to crash the migration console.

New User/Group Mapping options – When performing user/group name mapping, the Output Translation option now allows you to further transform the name that results from your name lookup before submitting that name for resolution in SharePoint.  This may be useful when your environment requires specific name formats that are not immediately available in your mapping source.

Also, the Test User/Group Mapping tool now has a Validate in SharePoint button which will try to resolve the name that your configured mapping process produces so you can verify that it really works in SharePoint. This should make experimenting with and debugging various user mapping options a little easier for everyone.  Note that this capability is only available when using client-side user mapping (not when configuring server-side mapping in the Import Service).

clip_image039

Improved CBA/FBA support – Connections to SharePoint sites using Forms Based Authentication now automatically renew cookies as needed during long migration jobs and other operations.  The tool will routinely check the expiration time of the authentication tokens that it holds for the client and, if a token is due to expire within a certain window, it will force a new authentication.  (The default time limit is 30 minutes, but this can be changed in your SharePoint Environment settings.)  Depending on your particular authentication system, this may appear to the user as a browser prompt forcing the user to re-authenticate.  In other cases, it may appear as a browser window that opens briefly and then closes again.
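
Conceptually, the renewal check works something like this (an illustrative Python sketch, not the actual implementation):

    import time

    def needs_reauthentication(token_expiry_epoch, window_minutes=30):
        # Force a new sign-in if the authentication cookie will expire
        # within the configured window (30 minutes by default).
        return token_expiry_epoch - time.time() <= window_minutes * 60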

clip_image040

Windows Authentication using alternate account for Link Tracking Database – One new connection option is available for connecting to the Link Tracking Database.  You can now specify that you want to use Windows Authentication, but supply an account other than your own.

clip_image041

Bulk Editing of certain database / class / job properties – Expanding on the tool’s capability to select multiple databases and set properties in bulk, you can now set a number of additional properties in a large number of selected databases.  You can even set certain properties inside the migration jobs assigned to the selected databases.  Finally, you now have a similar set of options for Technical Classes and Business Classes as well.

Usability improvements – A number of things have changed in the product to make the tool easier to use, including rearrangement of dialogs and additions of menus.  More pop-up help icons and more context-sensitive links to help topics have been added.  A completely rebuilt “Add/Remove Columns” dialog and built-in documentation describing the 170 possible built-in columns make customizing views much easier.

clip_image043

clip_image045

Introducing the Notes Migrator for SharePoint 6.1 beta program

Notes Migrator for SharePoint 6.1 is a significant product release that pushes the product further in three important areas:  Design Migration, Content Migration and Pre-Migration Analysis. 

Design Migration

  • Migrate Notes Views to SharePoint Views
  • Content Type Generation
  • New Column Provisioning Options
  • Provision Calculated columns
  • Provision Lookup Fields

Content Migration

  • Migrate by Form Used
  • Improved Content Type Support in Migration Jobs
  • Normalize multi-valued items to multiple documents
  • Dynamic link tracking for URL columns
  • New Run Job options
  • Provision Alternate Libraries

Analysis / Migration Console

  • Capture Design Copies during Analysis
  • Classify by Last Used (All Replicas)
  • Import data into repository from CSV files

General

  • Performance and scalability for Migration Console
  • New User/Group Mapping options
  • Improved CBA/FBA support
  • Windows Authentication using alternate account for Link Tracking
  • Bulk Editing of certain database / class / job properties
  • Usability improvements

 

Once again, the beta will be managed on the new SharePoint for All community site.  If you would like to participate in the beta program, go to http://communities.quest.com/groups/notes-migration-product-beta-group.  Sign in with your Quest Community ID, or register to create a new one.  Then press the “Ask To Join This Group” button.

image

One of the site owners will review your request and will typically approve it the same day.  You will receive a notification and then get full access to technical content and (of course) the beta build itself.

Quest Support will not be able to help you with this version until it releases, so please use the group’s Discussion area for any questions, problems or suggestions.

New Webcast: Migrating Lotus Notes Applications to SharePoint Online in Office 365

How should I connect? How do I link? Which solutions do I need to install?

[Note: I am updating this old post to reflect the latest migration options in Notes Migrator for SharePoint 6.0.1.  Specifically, the “Lightweight Migration Service” is no longer needed.]

Connection options

Notes Migrator for SharePoint 6.0 now supports two very different ways to connect to SharePoint sites in order to migrate content to them. 

1. Quest Import Service.  The “classic” way is to install the Notes Migrator for SharePoint Import Service.  This is a stand-alone IIS web application that you run on one or more of your SharePoint front-end server boxes.  You have to directly access (or remote into) a SharePoint front-end server, run the NotesMigratorForSharePoint-Services-64bit-6.0.0.x.msi setup program, and select the “Import Service” option.  You need to be a farm administrator to install it and think about service accounts, permissions, etc. 

You also have to configure every new SharePoint site collection you create to use a particular Import Service instance.  To make this possible you also need to install the “Front End Services” solution included in the same NotesMigratorForSharePoint-Services-64bit-6.0.0.x.msi setup program.  Unless you are putting these components on different physical machines, you would simply install both components at once, which is the default. 

The Quest Import Service is definitely not trivial to install, but it is by far the most powerful and best performing option.  It should be noted, however, that there are three cases where the Quest Import Service cannot be used at all:

  • You are migrating to Office 365 (SharePoint Online Standard or Dedicated)
  • Your SharePoint site is using Claims Based Authentication
  • Your administrator refuses to install third-party code on your SharePoint environment

image

2. SharePoint 2010 Web Services.  The new option, for 2010 customers only, is to migrate via Microsoft’s new out-of-the-box web service.  This is much simpler to deploy – in fact there is often no need to deploy anything on your servers at all (see below).  There are really only two disadvantages to using this approach.  First, it can be significantly slower than running migrations via the Import Service.  Second, there is a slight limitation to how our Link Tracking Service works.  As you will see below, everything works in the end, but the user experience suffers a little until you finalize your links.

Linking options

Related to your choice of connection options is the choice of Link Tracking Service options.  The Quest Link Tracking Service is an optional feature that keeps track of all the Notes documents you have migrated and dynamically redirects users to the current location.  I won’t go into all the details of the service here, but I want to focus on how the Link Redirector page works. 

If you enable the Link Tracking Service, every Notes DocLink (or HTTP link to a web-enabled Notes document) in every migrated document gets converted to an HTTP link to a Link Redirector page (QuestLinkTracking.aspx).  This redirector page typically performs a lookup in a centralized Link Tracking database and then dynamically redirects the user to the migrated version of the document in SharePoint (if it has been migrated) or to the original Notes version (if it has not yet been migrated).  So the natural question here is: Where does this Link Redirector page live and how does it get installed?
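
In other words, the redirect decision boils down to something like this (an illustrative Python sketch; the dictionary stands in for the Link Tracking database):

    def redirect_target(notes_unid, link_tracking_db, original_notes_url):
        # link_tracking_db maps Notes UNIDs to the SharePoint URLs of
        # documents that have already been migrated.
        sharepoint_url = link_tracking_db.get(notes_unid)
        if sharepoint_url:
            return sharepoint_url       # document already migrated
        return original_notes_url       # not migrated yet; go back to Notes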

There are actually now two different versions of the Link Redirector page that you can choose from.  First is the “classic” one that you get when you install the Front End Services solution described above.  This one is configured on a per site collection basis, alongside the Quest Import Service.

An alternative version is the Sandbox Link Redirector page.  This version is intended for cases where you do not have the ability to install custom solutions and/or you cannot establish SQL connections from your server to a shared Link Tracking database.  The main case we were thinking of when we designed this solution is Microsoft’s Office 365 environment and other highly secured hosting environments, but there will be plenty of people who prefer this option even for on-premises environments.  This page is packaged as a simple SharePoint solution (Quest.SandboxLinkRedirector.wsp).  Because it is a sandbox-safe solution, it can actually be installed by any site collection administrator, even on locked-down environments such as Office 365, without involving your farm administrators at all.

image

Note that because the Sandbox Link Redirector page does not connect to an external Link Tracking database, it always offers to redirect users to Notes, even if the document has been migrated to SharePoint.  In this scenario, users will not actually get redirected to their new SharePoint documents until their links are Finalized.  The saving grace here is that you can Finalize your links as often as you want to.  In production migrations, customers often choose to Finalize links on a daily basis.

image

Putting it all together

Wow that is a lot of options and choices here!  Let me try to simplify things with a nice table.

Migration mode         | Quest Import Service                                             | SharePoint 2010 Web Services
SharePoint versions    | 2007, 2010                                                       | 2010
Office 365 (BPOS)      | Dedicated only                                                   | Dedicated and Standard
Server installation    | Administrator must run MSI, etc.                                 | None
Server configuration   | Per site collection                                              | None
Performance            | Fastest                                                          | Slowest
Functional limitations | None                                                             | None
Link Tracking Service  | Full Dynamic Link Redirection (via Front End Services solution)  | Limited Redirection (via Sandbox Link Redirector solution) *

* NOTE: Strictly speaking, it is possible to install the Front End Services solution (with full dynamic link redirection using a Link Tracking database) even if you are not installing the Quest Import Service.  We believe, however, that most people will either want to install the full solution or keep things as light as possible and will not often mix and match.

Utilizing Content Types during a Notes to SharePoint migration (Part 2)

Part 1 of this article [link] describes how SharePoint Content Types might be useful when migrating multiple Notes “document types” to a single SharePoint List, Document Library, or InfoPath Forms Library.  Now we will take a look at how this can be accomplished using our tool, Notes Migrator for SharePoint.

First a recap of the Content Type features in the Notes Migrator for SharePoint documentation:

  • To assign the Content Type of a SharePoint item, simply add a Text field named “ContentType” and map the appropriate input data to it.
  • The ContentTypes property in a Target Data Definition field indicates that the field should only be migrated to SharePoint for certain Content Types.  Leave this property blank to always write the field, regardless of content type.

The following example walks through the steps needed to create a SharePoint list using Content Types and then migrate Notes content to it:

Our example Notes database is a fairly typical Notes document library except that it has three types of documents in it:  Tech Notes, Code Samples, and FAQ Documents.  The “document type” is really controlled via a DocType field (not separate forms as described in Part 1).  All documents contain standard fields such as “Subject”, “Category”, and “Product”.  Code Sample documents contain an extra “Language” field and FAQ documents contain an extra “FAQ Type” field.

image

On our SharePoint site we created a corresponding List with three content types similar to the ones we had in Notes:

  • Content type “Tech Note” has columns “Title”, “Product”, “Details”, “Url” (a “link” field).
  • Content type “Sample” inherits from “Tech Note” and adds column “Language” (a choice field).
  • Content type “FAQ” inherits from “Tech Note” and adds column “FAQType” (a choice field).

Note: I intentionally designed the Notes “FAQType” field to have a slightly different set of choices than the WSS “FAQType” field so we can demo data massaging.

In the following screen shots, notice that certain columns only apply to certain Content Types:

image  image

 

For the migration job itself, I selected all the data fields, including the DocType field as well as fields that were specific to certain document types, in my Source Data Definition.  Nothing unusual here, except that I did a little “data massaging” along the way:

  • The “Product” column is really the Categories item with an alias.
  • The “FAQType” column is a Formula column that massages the data during migration (a plain Python equivalent appears just after this list): 
    @If(FAQType="How To" | FAQType="Problem";"Solutions";FAQType)
  • The “OnlineLink” column is a Formula column that computes the URL to the document on the Proposion web site.
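
For readers less familiar with Notes formula language, here is a plain Python equivalent of the FAQType massaging above. This is just a sketch of the transformation; in the actual job the work is done by the formula column itself.

    # Python equivalent of the @If formula above: the Notes choices "How To" and
    # "Problem" both collapse into the single SharePoint choice "Solutions".
    def massage_faq_type(faq_type: str) -> str:
        return "Solutions" if faq_type in ("How To", "Problem") else faq_type

    print(massage_faq_type("How To"))    # -> Solutions
    print(massage_faq_type("Question"))  # -> Question (passed through unchanged)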

image

In our Target Data Definition, we added a SharePoint field called “ContentType”.  We can map any source column we want to this field, and the Content Type for each new document will be set accordingly.  In our case, of course, we will map the DocType column to this field.

image

For the “Language” and “FAQType” fields, which apply only to certain Content Types, we specified the content type in the ContentTypes property of the Target Data Definition field definition.

image 
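
Pulling these mappings together, the net effect on each migrated document looks roughly like the following sketch. This is not how Notes Migrator for SharePoint is implemented; the dictionary of restrictions and the field names are just stand-ins for the ContentType mapping and the per-field ContentTypes property described above.

    # Illustrative sketch of the effect of the mappings above (not NMSP internals).
    # The DocType value drives the SharePoint Content Type, and fields tagged with
    # a ContentTypes restriction are only written for matching documents.
    FIELD_CONTENT_TYPES = {          # hypothetical per-field restrictions
        "Language": {"Sample"},
        "FAQType": {"FAQ"},
    }

    def build_item(notes_doc: dict) -> dict:
        item = {"ContentType": notes_doc["DocType"]}         # DocType column -> ContentType field
        for name, value in notes_doc.items():
            if name == "DocType":
                continue
            allowed = FIELD_CONTENT_TYPES.get(name)
            if allowed is None or item["ContentType"] in allowed:
                item[name] = value                           # unrestricted field, or matching content type
        return item

    # Example: a Code Sample document keeps its Language value; a Tech Note would not.
    print(build_item({"DocType": "Sample", "Title": "Sorting a view", "Language": "LotusScript"}))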

The result of running this migration job is a SharePoint List populated with mixed types of content:

image

Utilizing Content Types during a Notes to SharePoint migration – Part 1

Content Types are a very powerful way to express the concept of “document types” or “business objects” in SharePoint. 

In the Notes world, it is quite common to design an application that has multiple types of documents in it.  Even though Notes databases are unstructured and (by design) lack any type of schema, good Notes developers usually adopt certain patterns and practices that impose some order on their applications, and this usually includes establishing distinct, meaningful document types.

From a development perspective, this is largely driven by the different Forms used to create Notes documents.  A great example of this is a standard Notes mail database, which has Memos, Tasks and Calendar Entries.  If you want a business application example, think of Customers, Customer Contacts, and Sales Reps in a CRM database.  Each of these “document types” has different fields in it and each is created with a different Form. 

image

While Notes Forms have several other uses (for example, to display a pop-up dialog), the Form used to create a document becomes the de facto “document type” in most cases.  An application may contain one or more Views or Folders for each document type (for example “All Tasks by Priority” or “Open Tasks”).  But it is also common to design a View or Folder that shows multiple document types together (for example showing Customers and Customer Contacts together in the same view).

A common variation on the above pattern is to use different forms to express extended document types (what object oriented developers would call “sub-classes”).   An example of this would be an application that had both Customers and Government Customers.  While a Government Customer is a type of Customer, it has additional fields and may have additional data validation rules, different security, etc.  You could easily imagine a View that shows “all customers” together without making any distinction, but when you opened a Government Customer record you would see something different than if you opened a “plain” Customer record.

The notion of defining different document types is not limited to Notes applications.  QuickPlace sites have several different default “Page Types” and users can create their own custom Page Types with additional fields.  Similarly Domino.Doc libraries can be extended with both custom Document Types and custom Binder Types.

So how do these concepts translate to the SharePoint world, and how do you deal with them when migrating?

When migrating a Notes application, a QuickPlace site, or a Domino.Doc library to SharePoint, developers typically do one of three things.  Note that each of the following choices applies equally well whether you are migrating to a SharePoint List, Document Library, or an InfoPath Forms Library:

  1. Split different document types into different Lists.  You could easily run one migration job that moved all Customer records to one list and another job that moved all Sales Rep records to a different List in the same SharePoint site.  This would be a particularly good idea if, for example, you wanted Customers to have a “Rep” field that was a Lookup field from the Sales Rep list.  It is less clear, however, that it would be good to split up Customers and Government Customers this way, as you probably want them to stay together.  Notes Migrator for SharePoint allows you to select records by View or by Formula; if you cannot find a View that selects the documents you want, you can use a Formula such as: FORM = "GovernmentContact"
  2. Merge different document types into a single uniform List schema.  For example, you could migrate all Customers to a single Customers list and discard the extra fields that were in Government Customers (a “least common denominator” approach).  Or you could include all the “government” fields in your new Customer definition and just leave them blank for the records they don’t apply to (a “grab everything” approach).  You might even add a new “Customer Type” field that records whether or not the record was a Government Customer in Notes.
  3. Use Content Types.  The pattern of using Forms to express document types in Notes applications maps almost perfectly to SharePoint “Content Types”.

image

There is a lot of information on the web about Content Types.  I like Andrew May’s blog [link].  The Microsoft SDK documentation [link], which can be pretty intimidating to read, introduces Content Types this way:

Content types, a core concept used throughout the functionality and services offered in Windows SharePoint Services 3.0, are designed to help users organize their SharePoint content in a more meaningful way. A content type is a reusable collection of settings you want to apply to a certain category of content. Content types enable you to manage the metadata and behaviors of a document or item type in a centralized, reusable way.

One of the reasons Content Types seem so intimidating to new SharePoint developers is that they are used for so many things.  Not only are they used to express which document types contain which fields (the important part for people migrating content), they can also be used for the following:

  • Custom user interfaces
  • Templates for creating new documents
  • Workflow rules
  • Microsoft Office integration
  • Inheritance from “base” Content Types
  • Reuse across multiple lists in a SharePoint site

The good news is that using Content Types for simple things (i.e., defining multiple structured document types in a single list) is actually pretty straightforward.  In spite of what some of the literature implies, you do not need to be a developer, create features, or even use XML to create document types.  I believe that as people come to understand Content Types better, they will become more widely used.

image

Notes Migrator for SharePoint has the ability to set content types in migrated documents based on the Form that was used in Notes, or whatever other criteria you want to use.  Part 2 of this article will discuss how we make that happen and will include a working example.