Welcome to the Admin vs Developer mini-series hosted by yours truly – Simon Lawrence and Jenny Bamber.

Simon is a well-certified Senior Developer and Salesforce.com MVP, who has been programming since he could move his fingers and has 6 years of commercial experience coding Salesforce solutions. He is generally called upon to design and construct the code-based Salesforce solutions Desynit need.

As you may know, Jenny is a certified Salesforce Administrator with 4 years of experience consulting, designing and implementing complex Salesforce projects at Desynit. She specialises in leveraging the declarative features of the Sales and Service Clouds.

Coming up in this series –

We are going to bring our creative and technical differences together on one topic each week – discussing what we think of a problem and how we might solve it from our respective Admin and Dev perspectives. This will then be followed by the Admin vs Dev challenge of the week… Let battle commence.

This Week’s Topic – Salesforce Data Loading

From hair-pulling complex data migrations to relatively simple record updates, we have all had our share of ups and downs when it comes to data loading.

Here are some of the questions that we frequently come across – How do you know which tools to use when importing and exporting data? And what triumphs can you achieve and problems can you dodge along the way?

Simon: There are a number of great external data loading solutions out there these days, and I generally go off platform when looking to do this. If your data is already in shape and ready to load, the desktop Data Loader (downloadable from the Setup menu) will comfortably pump a few million records into your org, and neatly leverages an OAuth 2.0 login, so no daft security token stuff any more! If you need to do anything more complicated – like re-parenting child records to new records in the destination, or transforming the data between the source files and the target org – you will probably be looking at a more complex tool such as Talend (the community edition is free) or CRMFusion's DemandTools, which comes as part of a wider (paid-for) suite of data tools.
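To give a flavour of that kind of transformation step, here is a minimal Python sketch of re-parenting: it maps legacy parent Ids to new Salesforce Ids using an export from the destination org that carries a legacy external Id. The file and column names (parent_export.csv, Legacy_Id__c, ParentId) are purely illustrative – substitute your own.

    import csv

    # Build a map of legacy parent Id -> new Salesforce Id, using an export
    # from the destination org that includes a Legacy_Id__c external Id field.
    with open('parent_export.csv', newline='') as f:
        parent_map = {row['Legacy_Id__c']: row['Id'] for row in csv.DictReader(f)}

    # Rewrite the child file so its lookup column holds the new parent Ids.
    with open('children_source.csv', newline='') as src, \
         open('children_ready_to_load.csv', 'w', newline='') as out:
        reader = csv.DictReader(src)
        writer = csv.DictWriter(out, fieldnames=reader.fieldnames)
        writer.writeheader()
        for row in reader:
            row['ParentId'] = parent_map.get(row['ParentId'], '')  # blank if no match
            writer.writerow(row)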

Jenny: Hands down, my go-to tool is the free version of Dataloader.io. Not only is it super user-friendly, it is also super powerful when it comes to uploading data. Granted, there are limits, but the free version is perfect when it comes to importing, exporting and deleting data. So when do you use this tool? If you are dealing with fewer than 10,000 records per month and fewer than 1,000 records per task (though there is the option to use the Bulk API). If you have more than that, you can pay to upgrade Dataloader.io, or you can take advantage of the Data Import Wizard (accessible via Setup). Anything bigger, see Simon's comments above.
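If you are nudging up against that per-task limit, one low-tech trick is simply to split your file into smaller chunks before loading. A rough Python sketch, assuming a file called contacts.csv (the file name and chunk size are just examples):

    import csv

    CHUNK_SIZE = 1000  # the free-tier 'records per task' limit mentioned above

    with open('contacts.csv', newline='') as f:
        reader = csv.DictReader(f)
        rows = list(reader)
        fieldnames = reader.fieldnames

    # Write contacts_part_1.csv, contacts_part_2.csv, ... each under the limit.
    for i in range(0, len(rows), CHUNK_SIZE):
        with open('contacts_part_%d.csv' % (i // CHUNK_SIZE + 1), 'w', newline='') as out:
            writer = csv.DictWriter(out, fieldnames=fieldnames)
            writer.writeheader()
            writer.writerows(rows[i:i + CHUNK_SIZE])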

Simon: Whilst most good Extract Transform Load/data loading tools will provide a Bulk API option in the interface, which doesn't eat up all your daily API calls, you might not be able to guarantee that all your customisations and automations are bulk friendly. Before loading lots of data (or committing to a timeline in which you will do so!) it's worth checking – maybe via a trial run in a data sandbox – that you are not going to come up against Flows that explode when they are hit with more than 50 new records, or a trigger that (coincidentally) throws a SOQL exception on the 101st insert.
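One way to script that kind of sandbox trial run – assuming you are comfortable with the open-source simple_salesforce Python library, and swapping in your own sandbox credentials and object – is to push a couple of hundred throwaway records through the Bulk API and see what your automations make of them:

    from simple_salesforce import Salesforce

    # Connect to a *sandbox* org - these credentials are placeholders.
    sf = Salesforce(username='me@example.com.datasandbox',
                    password='********',
                    security_token='********',
                    domain='test')

    # A few hundred records is enough to exercise Flows and triggers in bulk
    # rather than one record at a time.
    records = [{'LastName': 'Bulk Test %d' % i, 'Company': 'Trial Run'} for i in range(250)]
    results = sf.bulk.Lead.insert(records)

    failures = [r for r in results if not r['success']]
    print('%d of %d records failed' % (len(failures), len(records)))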

For huge data loads (into the 100,000s of records), if you know for a fact that the data already meets your data quality standards, you should consider turning off validation rules and workflows that you do not need to fire, as they can introduce significant computational overhead. I once worked on a data load where 450,000 records were loading in at an average of about 100 a minute – far too slow for our needs! After analysing and disabling a couple of triggers and workflows, we had the whole lot uploaded in a matter of minutes. I also think some of those workflows – after the analysis – were never turned back on!

Jenny: Dupes, dupes, dupes. You don't want them in your org, so why put them in your import doc?! Dataloader.io, along with some other 'Admin' tools, does not detect duplicate data, so to avoid creating a heap of trouble in your live org, either clean down your data beforehand or get your client to do it during the preparation stage of their project. The last thing you want to deal with is pesky data errors, especially if you are up against a deadline. If you do get errors, however, you can rest assured that they are very descriptive, so resolving the data issues is normally pretty straightforward – assuming you are using Dataloader.io, that is.
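As a rough example of the kind of clean-down Jenny is describing, this little Python sketch flags duplicate rows in an import file by email address (the file and column names are only illustrative):

    import csv
    from collections import Counter

    with open('contacts.csv', newline='') as f:
        rows = list(csv.DictReader(f))

    # Count each email address, case-insensitively, ignoring blanks.
    counts = Counter((row.get('Email') or '').strip().lower() for row in rows)
    counts.pop('', None)

    # Anything appearing more than once is a candidate duplicate to resolve
    # before the data goes anywhere near your live org.
    for email, n in sorted(counts.items()):
        if n > 1:
            print('%s appears %d times' % (email, n))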

Once the data quality standards are met, it's best to start prepping your data upload .csv. One of the most frequent problems I come across is issues with mapping fields. Some tips when it comes to preparing your data spreadsheet: make sure all your columns have titles, as blank headers will prevent you from uploading data. And if you are uploading accounts/contacts along with addresses, make sure you separate the address lines into their own columns (Billing Street, Billing County, Billing Postcode etc.). Admin 101 for data uploading, I know, but you'd be surprised at how often it catches people out. Get the above right and you'll be well on your way to trouble-free data uploading.
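A quick sanity check along those lines – blank headers and missing address columns – can also be scripted. A minimal Python sketch; the required column names below are just an example and should match your own import template:

    import csv

    # Example address columns - adjust to match your own import template.
    REQUIRED_ADDRESS_COLUMNS = {'Billing Street', 'Billing City',
                                'Billing County', 'Billing Postcode'}

    with open('accounts.csv', newline='') as f:
        headers = next(csv.reader(f))

    blank_headers = [i + 1 for i, h in enumerate(headers) if not h.strip()]
    missing = REQUIRED_ADDRESS_COLUMNS - set(headers)

    if blank_headers:
        print('Blank column titles in positions:', blank_headers)
    if missing:
        print('Missing address columns:', sorted(missing))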

OK, so now you know which tools to use and how to avoid those ever-so-popular issues.

This Week’s Challenge – Marshmallows

So Simon and I make a lot of noise in the Salesforce Community, but who really has the biggest gob? Let the marshmallows decide!

Fast forward to Week 2 to find out who won! And don’t forget to tweet us at @jenny_bamber @srlawr with encouragement.

See you next week.

Jenny Bamber September 20, 2017

3 thoughts on “Salesforce Data Loading and Marshmallows – Admin vs Developer Challenge 1”

  • Great article and thanks for sharing. When data loading, I always recommend backing up with record Ids included for any data that is being updated. That way if something goes wrong in the load there is an easy rollback strategy. Also a fan of dataloader.io (and workbench) for data operations.

    • You are right, Anthony. A key point we both missed in this post. Human error happens, so to save your ego and business flow from taking a blow, always back up and have a rollback strategy in place! *Thumbs Up* A rough sketch of what that backup step might look like is below.
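For anyone who wants to make that backup step concrete, here is a rough Python sketch using the open-source simple_salesforce library (the credentials, object and field are placeholders): before a mass update, export the record Ids and the current values of the fields you are about to change, so the resulting .csv can be used to roll back.

    from simple_salesforce import Salesforce
    import csv

    # Placeholder credentials - point this at the org you are about to update.
    sf = Salesforce(username='me@example.com', password='********',
                    security_token='********')

    # Export Id plus the field(s) you are about to change.
    results = sf.query_all("SELECT Id, Phone FROM Contact")

    with open('contact_phone_backup.csv', 'w', newline='') as out:
        writer = csv.DictWriter(out, fieldnames=['Id', 'Phone'])
        writer.writeheader()
        for record in results['records']:
            writer.writerow({'Id': record['Id'], 'Phone': record['Phone']})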
