How to set up your Migration Profile in DocAve

Post Date: 11/08/2017

In the tech world, especially in the world of SharePoint, migration means something very different than it does to those outside the IT bubble, who most likely associate the term with birds flying south for the winter or lucky retirees basking in the warm weather of southern states during colder months. For us tech types, the core concept is the same: in SharePoint, migration is the transfer of data from one version of SP to another, or from an on-premises environment into the cloud. Like any move, it’s a hefty undertaking. And, like a moving company complete with a strong crew, big trucks, and all the dollies and bungee cords, we’re here to help!

In this how-to video, we’re going to show you how to set up your migration profile settings in DocAve. The reason these settings exist in a profile is to save you the time of configuring settings for each migration job individually. To get started, click on the profile settings button on the DocAve toolbar.

You will see that you can select your profile from a dropdown menu, which includes the default profile; any profile in the list can be set as your default.

Now, we’re going to go through the settings and explain each one. First, under ‘source component options,’ start with the filter policy. This determines what will be migrated. If there’s no filter policy, everything is migrated. A filter policy is inclusive, so anything that meets its conditions will be migrated and everything else will be ignored. For instance, if you only want to migrate documents that have been modified in the last three years, the proper filter policy setting will bring those over and skip the rest.
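The inclusive behavior of a filter policy can be sketched in a few lines of Python. This is purely illustrative: the function name, document fields, and dates below are assumptions, not DocAve’s actual API.

```python
from datetime import datetime, timedelta

# Hypothetical document records; DocAve's real objects differ.
documents = [
    {"name": "budget.xlsx", "modified": datetime(2016, 5, 1)},
    {"name": "old_plan.docx", "modified": datetime(2012, 1, 10)},
]

def inclusive_filter(docs, cutoff):
    """Keep only documents modified on or after the cutoff date;
    everything else is ignored by the migration."""
    return [d for d in docs if d["modified"] >= cutoff]

three_years_ago = datetime(2017, 11, 8) - timedelta(days=3 * 365)
to_migrate = inclusive_filter(documents, three_years_ago)
print([d["name"] for d in to_migrate])  # only the recently modified document
```

The key point is the direction of the rule: the policy selects what to include, and anything that doesn’t match simply never leaves the source.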

Be careful when selecting alerts, because if the destination environment is live, active, and able to send emails, then all users will get emails for every single alert and every document that is uploaded.

If you’re using workflows, you’re definitely going to want to bring over the definitions. If you have any running workflows, you need to decide how to handle them, because a running workflow can’t be kept at its current stage. It will have to be canceled, but you can choose either to leave it canceled or to restart the workflow from the beginning.

For managed metadata service terms, if a term exists in the source but not in the destination, you’ll get an exception in the job, so you need to make sure all the terms are available. You can migrate individual terms as you need them, migrate whole term sets (which may bring over more terms than you need), or bring over everything. Alternatively, if you leave the ‘migrate managed metadata service’ box unchecked, you must make sure that every term in the source is already available in the destination.

‘Include empty lists and libraries’ is checked by default. If you want to reduce clutter from empty lists or libraries, which are most likely unused, you can uncheck it to exclude them from your migration. If you’re not sure, leave it checked.

Next, we’re going to look at the mapping options. We expect you’ll have used a discovery tool on the source so you know what’s there. If the source contains custom templates that won’t be recreated in the destination, you’ll need to specify which templates to use in their place, which you can do in the template mapping settings.

If any columns or content types in the source need to be changed to different ones in the destination, you can map them in the ‘content type mapping’ section. Depending on the column type, though, it might not map cleanly, so be careful that the columns you map to make sense in relation to what you’re mapping from.

With ‘user mapping,’ the biggest issue we come across is a user who no longer exists in your Active Directory yet created or modified a document. By default, with no user mapping, DocAve will use its own service account, and you’ll see that account instead of the user that no longer exists. If you don’t want to see the service account, you can configure another account.

However, what most customers do, and what we recommend, is to create a placeholder account and configure it in user mapping so DocAve uses that account to upload the document. Because it’s just a placeholder, you can edit the metadata to make it look like the non-existent user is still there; the source and the destination will then look exactly the same, which is why most people use a placeholder. Another reason to use user mapping is if your actual user accounts differ between the source and the destination. For example, if the source account is J.Smith instead of John.Smith, that can be specified via user mapping. This can mean mapping a lot of users, so you’re able to download and upload the mapping file and take care of it outside of the GUI.

However, if the usernames are the same between the source and the destination but the domains are different, you can use ‘domain mapping’ in DocAve, and we’ll simply replace the source domain with the proper destination domain. If the domains are the same between the environments, you don’t need domain mapping at all, just the user mapping for the placeholder account so we know what to do with the deleted accounts.
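The relationship between user mapping and domain mapping can be sketched like this. The domain names, account names, and mapping-file format below are illustrative assumptions, not DocAve internals: an explicit user mapping entry (renamed or deleted accounts) takes priority, and a blanket domain swap covers everyone else.

```python
# Explicit per-user mappings: renamed accounts and a placeholder
# for users who no longer exist in Active Directory.
user_mapping = {
    "OLDDOMAIN\\j.smith": "NEWDOMAIN\\john.smith",
    "OLDDOMAIN\\deleted.user": "NEWDOMAIN\\placeholder",
}

def map_user(source_user):
    """Resolve a source account to its destination account.
    Explicit user mapping wins; otherwise fall back to domain mapping."""
    if source_user in user_mapping:
        return user_mapping[source_user]
    domain, _, name = source_user.partition("\\")
    if domain == "OLDDOMAIN":
        return "NEWDOMAIN\\" + name  # same username, domain swapped
    return source_user

print(map_user("OLDDOMAIN\\a.jones"))       # domain mapping applies
print(map_user("OLDDOMAIN\\deleted.user"))  # placeholder account applies
```

In practice the per-user table is what you would download, edit in bulk, and upload again rather than maintaining it in the GUI.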

Language mapping and list name mapping aren’t heavily used, but their uses are relatively self-explanatory.

Next, ‘advanced options.’ The first option, ‘preserve null column values,’ determines how the default value in a column is handled. For example, if people were lazy in the source and weren’t classifying documents the way they should, you might not want a document-classification column to stay empty in the destination.

You might want it to fall back to a default classification, for example an MMS term, so you can easily find those documents and say, “okay, all the ones that are unclassified, we need to classify these.” Selecting ‘yes’ preserves the blank null value, so the column will look empty in the destination. Selecting ‘no’ brings the document over with the destination column’s default value instead of leaving it empty. This is a very important setting, so be sure you understand what your choice will mean.
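The yes/no choice reduces to one branch, sketched below. The column values and the “Unclassified” default are illustrative assumptions, not DocAve settings.

```python
def migrate_column_value(source_value, default_value, preserve_null):
    """If preserve_null is True, an empty source value stays empty.
    Otherwise the destination column's default value is applied."""
    if source_value is None and not preserve_null:
        return default_value
    return source_value

# A document that was never classified in the source:
print(migrate_column_value(None, "Unclassified", preserve_null=True))   # None
print(migrate_column_value(None, "Unclassified", preserve_null=False))  # Unclassified
```

With ‘no’, every formerly blank classification column becomes “Unclassified”, which makes those documents trivial to find and clean up later.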

The ‘change site look and feel’ setting is used primarily if, say, you’re migrating from 2007 to 2010 or 2010 to 2013 when the look and feel actually did change. You can change this in the migrator, or you can preserve it, which allows you — or the end user — to trigger that change in the future once you’re satisfied with your migration.

The ‘collapse folder structure’ setting is also heavily used, because customers often find that their end users aren’t utilizing SharePoint properly: they have a multitude of folders where they should be using metadata and views. To push end users toward those best practices, you can collapse the folder structure. If the folder structure actually carried meaning for the documents, you can still add a column that lists exactly what the folder path was; users who need to fill in metadata can reference that column and populate the metadata fields based on the original folder structure.
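Collapsing a folder tree while keeping the path in a metadata column can be sketched as follows. The file paths and the “Source Folder” column name are illustrative assumptions.

```python
# Hypothetical nested files in the source library.
source_files = [
    "Finance/2016/Q1/budget.xlsx",
    "Finance/2016/Q2/forecast.xlsx",
    "HR/policies/handbook.docx",
]

def collapse(paths):
    """Flatten every file into the library root, keeping the old
    folder path in a 'Source Folder' metadata column."""
    items = []
    for p in paths:
        folder, _, name = p.rpartition("/")
        items.append({"name": name, "Source Folder": folder})
    return items

for item in collapse(source_files):
    print(item["name"], "<-", item["Source Folder"])
```

Nothing about the original hierarchy is lost; it simply moves from the folder tree into a column that views and filters can use.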

‘Character length settings’ are only needed when you want to enforce a maximum character limit; otherwise, leave them at the default maximum settings.

Lastly, ‘dynamic rule.’ If there is anything you’d like to do that you can’t necessarily do out of the box, you can set it here. If there’s any sort of mapping that is completely different from what you see in the mapping options, or if you need to use a filter that you don’t see in the filter policy, you may be able to take care of that as a custom dynamic rule that we would need to help you generate.

Sabrina is our Vice President of Customer Success, responsible for ensuring product adoption that drives customer success, engagement and retention.

View all posts by Sabrina Vazquez