Start Collecting Your Dealership’s Data So You Can Understand & Improve Your Store’s Results – Reposted

This is a re-post of my post on ADP (CDK Global) Expertise today.

What I am about to explain will change your life if you are at all frustrated with your store's performance.

It may sound strange, but performance issues usually arise from a lack of feedback to the workgroups within your operations. I use the term workgroups because there are numerous groups of employees in your dealership that need feedback to perform optimally. These workgroups include:

  • An Individual
  • A Team (usually front-end sales teams, but this can include techs and other employees)
  • A Department
  • A Store
  • Same stores – a workgroup important for comparing Year over Year (YoY) results
  • A Division (a group of stores by franchise or geographical location)
  • A Region (usually geographical)
  • A Country (if you have stores in different countries)

How would you like to improve the profitability of all of your stores right now?

We will need to use your data from your DMS (Dealership Management System), extracted properly for analysis and presented in a way that ranks your workgroups. The following SharePoint Online interactive report has a tab that ranks the performance of two workgroups: stores and individual F&I Managers:


SharePoint Online, a premier business data presentation platform, is included with Enterprise Office 365 subscriptions. With SharePoint Online you can control who is able to see each individual report based on their logon; those who don't have permission will not even see the report link on the site.

Feedback is extremely important if you want your employees to improve. If your employees don't get feedback, they will either:

  • Get the data on their own or ask for it, or
  • Just maintain the status quo (under-performing),

and in either case, eventually leave your employ.

The sad part is that the good ones leave (because they are not recognized) along with the not-so-good (because they are not motivated).

We need to take advantage of a person’s or workgroup’s natural competitive instincts!

Critical to reaping the performance improvement that feedback can create is ranking the workgroups. Feedback against objectives is fine, but it doesn't get the competitive juices flowing. We have to rank them!

In my experience, when you rank workgroups they all rise, even the bottom individual or workgroup. That's because everyone wants (on some level) to do well.
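To make the ranking idea concrete, here is a toy sketch in Python. The manager names and per-deal gross figures are entirely made up; the point is simply turning a workgroup's raw results into a ranked list rather than a flat target.

```python
# Toy ranking of one workgroup (hypothetical F&I managers, made-up numbers).
fi_managers = [
    ("Alice", 1450.00),   # (name, average F&I gross per deal)
    ("Bob", 1125.50),
    ("Carla", 1610.25),
]

# Rank highest average gross first; enumerate from 1 for display.
ranked = sorted(fi_managers, key=lambda m: m[1], reverse=True)
for rank, (name, gross) in enumerate(ranked, start=1):
    print(f"{rank}. {name}: ${gross:,.2f} avg gross per deal")
```

The same sort-and-number pattern applies whether the workgroup is an individual, a store, or a whole division.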

There are three steps to making this happen:

  1. Automate the collection of the data.
  2. Train your CFO and Controllers on how to use the latest tools to analyze the data efficiently.
  3. Train your CFO and Controllers on how to create reports with those tools and publish them to SharePoint and Office 365.

I have a lot of experience with CDK, InfinitiNet and DealerTrack, so that is what I will stick with here. You can access your data on these platforms, but I am not so sure about Reynolds anymore.

Step 1:

To automate the collection of your data, all you really need is an ODBC connection, some knowledge of the DMS data model, and a subscription to Microsoft Azure. You would then ask someone like me to write a PowerShell script that runs on your server, extracts the data from the DMS at least daily, and moves it to a SQL Azure database (or an on-premises one).
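As a rough illustration of that extract-and-load step, here is a sketch in Python using pyodbc. The real script would be PowerShell against the actual DMS schema; the DSN, connection string, query, and table names below are placeholders, not CDK or DealerTrack specifics.

```python
# Sketch of a daily "extract from DMS, load to SQL Azure" job.
# All connection details and object names are hypothetical.

def chunked(rows, size=500):
    """Yield rows in fixed-size batches so each INSERT round-trip stays small."""
    for i in range(0, len(rows), size):
        yield rows[i:i + size]

def extract_and_load(dms_dsn, azure_conn_str):
    import pyodbc  # imported lazily so chunked() works without the driver installed
    src = pyodbc.connect(dms_dsn)
    rows = src.cursor().execute(
        "SELECT StockNo, DealType, SalesDate FROM VehicleSales"  # hypothetical view
    ).fetchall()
    dst = pyodbc.connect(azure_conn_str)
    cur = dst.cursor()
    for batch in chunked([tuple(r) for r in rows]):
        cur.executemany(
            "INSERT INTO VehicleSalesCurrent (StockNo, DealType, SalesDate) "
            "VALUES (?, ?, ?)",
            batch,
        )
    dst.commit()
```

Batching the inserts matters over a WAN link to Azure; one row per round trip is painfully slow.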

You can skip this step if you are on CDK and you have signed up for their wonderful MDA (Managed Data Access) or DDA (Dealer Data Access) product. Those products already serve up daily refreshes of data in a PostgreSQL database on your CDK server.

Step 2:

CFOs, Controllers, and even advanced Business Managers are the ones to analyze the data and create the reports. They need training, and I will be setting up some courses here in Las Vegas to train them. I don't advocate having IT people do this because they just don't have the knowledge of how these numbers are generated: where they come from and how they inter-relate. So they need training in:

  • The DMS data model, i.e., where the data resides and what transactions put it there.
  • How to use Excel 2013 and 2016 with PowerPivot, Power Query, and other tools to create interactive reports that refresh automatically when the underlying data is updated. You don't want a set of gorgeous reports that require your people to update them manually; we will let SharePoint Online handle that.

Step 3:

Because this data is confidential, and because we need to make sure it is correct, this step also falls to the financial side of the business. I recommend it be handled by the CFO, Controller, and/or Business Manager. In larger dealer groups the CFO or Controller can hire someone with this expertise, but that person needs to understand car dealership transactions and accounting. There are two kinds of reporting in the dealership (I teach all the competencies needed for both):

  1. Permanent periodic reports. These include rankings, Over-Age Inventory, Heat Sheets, A/R Aging, CIT, Incentives, Upcoming Lease Termination reporting, etc. These reports need to be built once and then automated on their own SharePoint page, with automatic refresh and proper permissions.
  2. Ad-hoc reports. These are reports you create one time, or on the fly, for analytics prior to a major business decision. In either case, properly trained financial people need to prepare them.
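As a sketch of the logic behind one of these permanent periodic reports, here is a hypothetical A/R Aging bucket calculation in Python. The bucket boundaries are the common 30/60/90-day convention, and the account names and amounts are made up.

```python
# Illustrative A/R Aging logic: classify each receivable into an age bucket.
from datetime import date

def aging_bucket(invoice_date, as_of):
    """Classify a receivable into a standard 30/60/90-day aging bucket."""
    days = (as_of - invoice_date).days
    if days <= 30:
        return "0-30"
    if days <= 60:
        return "31-60"
    if days <= 90:
        return "61-90"
    return "90+"

as_of = date(2016, 3, 31)
receivables = [
    ("Acme Body Shop", date(2016, 3, 15), 1200.00),  # made-up accounts
    ("Fleet Co", date(2016, 1, 5), 840.00),
]
for name, inv_date, amount in receivables:
    print(name, aging_bucket(inv_date, as_of), amount)
```

The real report would pull the open items from the DMS extract and pivot the buckets by account; the classification rule is the piece that gets built once and automated.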

I will be sending out an email announcing a seminar in the spring to lay this out for Dealers, CFOs and Controllers in detail. If you want to be on the list, you can email me at jdonnelly at jdonnelly dot com or give me a call at 702.498.8777.

Getting CSV File Data into SQL Azure

I have been using a trial to see whether SQL Azure might be a good place to store regular data extractions for some of my auto dealership clients. I have always wanted to develop against SQL Server because of its (seemingly) universal connectivity; nearly all the web development frameworks I use can connect to SQL Server.

So I decided to give it a try and built a vehiclesalescurrent database.

I then researched the proper way to load the table. BULK INSERT is not supported in SQL Azure; only the Bulk Copy Program (the bcp utility) is available to import CSV files. The bcp utility, being a command-line program, is not very intuitive. It is rather powerful, though: it can import (in), export (out), and create format files, and it has just about every switch you can think of. I forged ahead and tried to do the import without a format file (which is apparently possible). The format file is an XML file that tells bcp how the CSV file maps to the table in the database. I received error after error, mostly relating to cast issues and invalid end-of-record. I was under the impression that a CSV file had a rather common end-of-record code known as CRLF, or carriage return/line feed. I opened my CSV file in Notepad++ with the view-codes option on to make sure there wasn't anything unusual going on. There wasn't.
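For reference, the same line-ending check can be done in a few lines of Python; reading in binary mode shows the raw record terminators exactly as they appear on disk. The file path and sample data here are just for illustration.

```python
# Count CRLF vs bare-LF record terminators in a CSV file.
# A tiny sample file is written inline so the example is self-contained.
import os
import tempfile

sample = b"ID,StockNo\r\n1,A123\r\n"
path = os.path.join(tempfile.gettempdir(), "sample.csv")
with open(path, "wb") as f:
    f.write(sample)

with open(path, "rb") as f:  # binary mode so Python doesn't translate newlines
    raw = f.read()

crlf = raw.count(b"\r\n")
bare_lf = raw.count(b"\n") - crlf
print("CRLF endings:", crlf)
print("bare LF endings:", bare_lf)
```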

Having no success the easy way, I decided to create a format file, which would tell bcp definitively what to look for as an end of record. The bcp utility will accept either an XML or a non-XML format file, and it will create either for you. I chose the XML format file because I just felt it might be less buggy. bcp spit out a file easily, but I still had to make a few modifications to the resulting XML. In particular, bcp got the column separator wrong ("\t" for tab), and I changed it to a comma (",") since the file was CSV. Also, the last column of data, in my case column 21, needed the terminator "\r\n", which is the offending return and newline (CRLF) codes! Make sure the slashes lean the right way; for some reason (I think I saw it in a blog post!) I put forward slashes, and I needed the help desk to straighten me out. Anyway, here is the bcp command to create an XML format file:
bcp MyDB.dbo.VehicleSalesCurrent format nul -c -x -f C:\JohnDonnelly\VehicleSalesCurrent.xml -U johndonnelly@xxxxxxxx -S -P mypassword

And here is the final, corrected format file, which you would include in the same directory as the vehiclesalescurrent.txt data file when running the bcp utility to upload the data to SQL Azure:

<?xml version="1.0"?>
<BCPFORMAT xmlns="http://schemas.microsoft.com/sqlserver/2004/bulkload/format" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
 <RECORD>
  <FIELD ID="1" xsi:type="CharTerm" TERMINATOR="," MAX_LENGTH="12"/>
  <FIELD ID="2" xsi:type="CharTerm" TERMINATOR="," MAX_LENGTH="24" COLLATION="SQL_Latin1_General_CP1_CI_AS"/>
  <FIELD ID="3" xsi:type="CharTerm" TERMINATOR="," MAX_LENGTH="24" COLLATION="SQL_Latin1_General_CP1_CI_AS"/>
  <FIELD ID="4" xsi:type="CharTerm" TERMINATOR="," MAX_LENGTH="24" COLLATION="SQL_Latin1_General_CP1_CI_AS"/>
  <FIELD ID="5" xsi:type="CharTerm" TERMINATOR="," MAX_LENGTH="16" COLLATION="SQL_Latin1_General_CP1_CI_AS"/>
  <FIELD ID="6" xsi:type="CharTerm" TERMINATOR="," MAX_LENGTH="11"/>
  <FIELD ID="7" xsi:type="CharTerm" TERMINATOR="," MAX_LENGTH="16" COLLATION="SQL_Latin1_General_CP1_CI_AS"/>
  <FIELD ID="8" xsi:type="CharTerm" TERMINATOR="," MAX_LENGTH="100" COLLATION="SQL_Latin1_General_CP1_CI_AS"/>
  <FIELD ID="9" xsi:type="CharTerm" TERMINATOR="," MAX_LENGTH="100" COLLATION="SQL_Latin1_General_CP1_CI_AS"/>
  <FIELD ID="10" xsi:type="CharTerm" TERMINATOR="," MAX_LENGTH="100" COLLATION="SQL_Latin1_General_CP1_CI_AS"/>
  <FIELD ID="11" xsi:type="CharTerm" TERMINATOR="," MAX_LENGTH="12"/>
  <FIELD ID="12" xsi:type="CharTerm" TERMINATOR="," MAX_LENGTH="11"/>
  <FIELD ID="13" xsi:type="CharTerm" TERMINATOR="," MAX_LENGTH="12"/>
  <FIELD ID="14" xsi:type="CharTerm" TERMINATOR="," MAX_LENGTH="30" COLLATION="SQL_Latin1_General_CP1_CI_AS"/>
  <FIELD ID="15" xsi:type="CharTerm" TERMINATOR="," MAX_LENGTH="100" COLLATION="SQL_Latin1_General_CP1_CI_AS"/>
  <FIELD ID="16" xsi:type="CharTerm" TERMINATOR="," MAX_LENGTH="100" COLLATION="SQL_Latin1_General_CP1_CI_AS"/>
  <FIELD ID="17" xsi:type="CharTerm" TERMINATOR="," MAX_LENGTH="40" COLLATION="SQL_Latin1_General_CP1_CI_AS"/>
  <FIELD ID="18" xsi:type="CharTerm" TERMINATOR="," MAX_LENGTH="12"/>
  <FIELD ID="19" xsi:type="CharTerm" TERMINATOR="," MAX_LENGTH="34" COLLATION="SQL_Latin1_General_CP1_CI_AS"/>
  <FIELD ID="20" xsi:type="CharTerm" TERMINATOR="," MAX_LENGTH="30"/>
  <FIELD ID="21" xsi:type="CharTerm" TERMINATOR="\r\n" MAX_LENGTH="12"/>
 </RECORD>
 <ROW>
  <COLUMN SOURCE="1" NAME="ID" xsi:type="SQLINT"/>
  <COLUMN SOURCE="2" NAME="StockNo" xsi:type="SQLNVARCHAR"/>
  <COLUMN SOURCE="3" NAME="DealType" xsi:type="SQLNVARCHAR"/>
  <COLUMN SOURCE="4" NAME="SaleType" xsi:type="SQLNVARCHAR"/>
  <COLUMN SOURCE="5" NAME="Branch" xsi:type="SQLNVARCHAR"/>
  <COLUMN SOURCE="6" NAME="SalesDate" xsi:type="SQLDATE"/>
  <COLUMN SOURCE="9" NAME="Address" xsi:type="SQLNVARCHAR"/>
  <COLUMN SOURCE="10" NAME="City" xsi:type="SQLNVARCHAR"/>
  <COLUMN SOURCE="11" NAME="Zip" xsi:type="SQLINT"/>
  <COLUMN SOURCE="12" NAME="BirthDate" xsi:type="SQLDATE"/>
  <COLUMN SOURCE="13" NAME="Year" xsi:type="SQLINT"/>
  <COLUMN SOURCE="14" NAME="Make" xsi:type="SQLNVARCHAR"/>
  <COLUMN SOURCE="15" NAME="Model" xsi:type="SQLNVARCHAR"/>
  <COLUMN SOURCE="16" NAME="Body" xsi:type="SQLNVARCHAR"/>
  <COLUMN SOURCE="17" NAME="Color" xsi:type="SQLNVARCHAR"/>
  <COLUMN SOURCE="18" NAME="Mileage" xsi:type="SQLINT"/>
  <COLUMN SOURCE="20" NAME="Down" xsi:type="SQLMONEY"/>
  <COLUMN SOURCE="21" NAME="Term" xsi:type="SQLINT"/>
 </ROW>
</BCPFORMAT>
As you can see, there is a section (RECORD) that defines what the bcp utility should expect for each field in the data file, and a section (ROW) that maps each field to a column name in the receiving table.

As I mentioned above, the field terminators are important (of course), and I found that bcp had them set incorrectly ("\t") for a CSV file. It also had the terminator for the last field the same as all the other fields ("\t"), when it needs to be the CR/LF codes. And as I mentioned, the terminator on my last column (21) needed backslashes; I think my Google search yielded bad advice, because I had forward slashes:

<FIELD ID="21" xsi:type="CharTerm" TERMINATOR="/r/n" MAX_LENGTH="12"/>

Obviously it should be:

<FIELD ID="21" xsi:type="CharTerm" TERMINATOR="\r\n" MAX_LENGTH="12"/>

With the good format file in hand, I ran the following bcp statement to handle the import:
# This one works, starts importing at Row 2, make sure dates are YYYY-MM-DD
bcp MyDB.dbo.VehicleSalesCurrent in C:\JohnDonnelly\VehicleSalesCurrent.csv -F 2 -f C:\JohnDonnelly\VehicleSalesCurrent.xml -U johndonnelly@xxxxxxx -S -P mypassword -e C:\JohnDonnelly\err.txt

The -e switch writes out a nice err.txt file that gives you much more information about errors than the console does. For a complete explanation of the switches, follow the bcp utility link above.

With the above format file things got a little better; however, I had to open a case because I kept getting an "Invalid character value for cast specification" error that cited a SQLDATE column type.

I learned from the help desk that the SQLDATE columns in my CSV file needed to look like yyyy-mm-dd, so each time I opened the CSV in Excel I had to apply a custom format: reformat the date columns as yyyy-mm-dd, then save the file. That is the quick way, anyway.
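If you'd rather not round-trip through Excel, a small script can rewrite the date columns instead. This is a sketch that assumes the dates arrive as mm/dd/yyyy and that SalesDate and BirthDate sit at zero-based positions 5 and 11, matching fields 6 and 12 in the format file above; adjust both assumptions to your actual extract.

```python
# Rewrite date columns in a CSV to the yyyy-mm-dd form bcp/SQL Azure expects.
import csv
from datetime import datetime

DATE_COLS = (5, 11)  # zero-based indexes of SalesDate and BirthDate (assumed)

def iso_date(value, fmt="%m/%d/%Y"):
    """Convert e.g. 03/15/2016 to 2016-03-15; pass blank values through untouched."""
    if not value.strip():
        return value
    return datetime.strptime(value, fmt).strftime("%Y-%m-%d")

def fix_dates(in_path, out_path):
    with open(in_path, newline="") as src, open(out_path, "w", newline="") as dst:
        writer = csv.writer(dst)
        for row in csv.reader(src):
            for i in DATE_COLS:
                row[i] = iso_date(row[i])
            writer.writerow(row)
```

Blank values are passed through deliberately, since forcing a parse on them would fail before the file ever reaches bcp.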

The support person said there was a bug in bcp relating to dates, which leads to the last item: out of 9,665 rows, there were 34 that wouldn't move due to this error:

“Col 12 – Invalid character value for cast specification”

Column 12 was a birth date of SQLDATE type.

Furthermore, 32 of the 34 rows were null in column 12, so I agree with support that the bcp utility or SQL Azure is buggy with regard to SQLDATE.
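One workaround for those rows, assuming blank birth dates are indeed the culprit, is to split them out into a reject file before running bcp so the main load runs clean. The column index and helper name here are illustrative, not part of any bcp feature.

```python
# Split rows with a blank date field (column 12, zero-based index 11)
# out of the load set so bcp never sees them.
def split_blank_dates(rows, date_index=11):
    """Return (clean_rows, rejected_rows), split on an empty date field."""
    clean, rejected = [], []
    for row in rows:
        (rejected if not row[date_index].strip() else clean).append(row)
    return clean, rejected
```

The rejected rows can then be loaded separately with the date column mapped as plain text, or fixed up by hand, instead of failing the whole batch.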

I hope this helps someone battling the bcp utility with SQL Azure!