
Quick Data Verification Tool

Recently, I had to put together a tool to verify data in Microsoft SQL Server database tables in no time. Even though the time was very short, I had a few requirements to follow as guidance while creating the tool:
  1. Flexibility - the tool should be usable across multiple projects with little or no modification.
  2. Lightweight - no third-party scripting frameworks like Ant/NAnt and no programming languages like Java, C#, etc.
Since the tool was intended specifically for Microsoft SQL Server on Windows environments, I decided to use batch scripting, SQLCMD (http://technet.microsoft.com/en-us/library/ms162773.aspx), and SQL scripting.

In order to make it flexible and not hard-code anything (like server name, database name, table name, etc.), so that the same SQL script could be reused from one project to another for similar verifications with little or no modification, I decided to externalize these attributes in a ';'-delimited .csv file. Each line of my .csv file followed this format:
<db-connection-info>;<db-name>.<table-name>;[<verification-criteria-expected-row-count>];[<other-criteria-1>];[<other-criteria-2>];[<column-name-1>];[<column-name-2>];[…]

Next, I had to put together a batch script that could read each line of the .csv file as a row, split it into columns, and assign those values to corresponding SQLCMD variables to pass to the SQL script during execution.
I used a 'for' loop in the batch script to achieve this. 'for /F' lets you choose the delimiter character, skip line(s) at the top of the file, and select which columns (tokens) to read. Here is a fragment of the batch script file:
for /F "tokens=1-7 skip=1 delims=;" %%c in (%dataFile%) do (
  
sqlcmd -S %%c -i %iSqlFiles% -v tableName="%%d" -v xRowCnt=%%e  -v col1-cond1=%%f -v col2-cond1=%%g -v col1Name="%%h" -v col2Name="%%i"  >> %oLogFile%
)

In this example, it skips the first line of the .csv file (passed as the variable %dataFile%), reads columns (tokens) 1 to 7 from each ';'-delimited line, and assigns them to variables starting from the letter 'c' (%%c through %%i). It then executes the SQLCMD command with the given options and the provided SQL file '%iSqlFiles%'.
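As a minimal, hypothetical sketch of how 'for /F' maps tokens to variables (using echo instead of sqlcmd, and a made-up three-column rows.csv):

```
REM Hypothetical rows.csv:
REM   server;table;expected
REM   .;mydb.mytable;5
REM "tokens=1-3" maps the three columns to %%c, %%d, %%e in order;
REM "skip=1" ignores the header line.
for /F "tokens=1-3 skip=1 delims=;" %%c in (rows.csv) do (
    echo server=%%c table=%%d expected-rows=%%e
)
```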

One catch here is the output file. Even though SQLCMD has a -o <output-file> option, I did not use it, because that option does not append to an existing file. If multiple sqlcmd commands execute in a loop, they either recreate the file each time, deleting the previous output, or create multiple files if a unique output file name is given for each execution. The easy way out is to redirect using the stream redirection operator '>>'.
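To illustrate the difference (a hypothetical sketch, not part of the tool itself): with '>>' every iteration appends to a single log, whereas '-o' leaves only the last command's output:

```
REM Appends: DVerification.log accumulates output from every call.
sqlcmd -S . -Q "SELECT 1" >> DVerification.log

REM Overwrites: each call replaces the previous contents of the file.
sqlcmd -S . -Q "SELECT 1" -o DVerification.log
```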


The final part was to create a reusable (template-like) SQL script file. The fragment below shows how variables passed from the SQLCMD command can be used in the SQL script without hard-coding the database name, table name, column names, etc.

SELECT @CurrRowCount = count(*) FROM $(tableName) where $(tableName).$(col1Name) NOT LIKE '$(col1-cond1)' and $(tableName).$(col2Name) NOT LIKE '$(col2-cond1)';
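The same substitution can be tried standalone (a hypothetical one-liner; sqlcmd replaces $(tableName) with the -v value before the query runs):

```
sqlcmd -S . -v tableName="mydatabase1.mytable1" -Q "SELECT count(*) FROM $(tableName);"
```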


I guess the idea is clear. Below are the complete .csv file, batch script file, and SQL script file, which you may be able to use:
  • .csv file: testcases.csv
#<db-connection-info>;<db-name>.<table-name>;[<verification-criteria-expected-row-count>];[<other-criteria-1>];[<other-criteria-2>];[<column-name-1>];[<column-name-2>];[…]
tcp:mysqlserver1\mysqlinstance1,20234;mydatabase1.mytable1;25;col-cond1;col-cond-2;mycolumn1;mycolumn2
.;mydatabase2.mytable1;0;col-cond1;col-cond-2;mycolumn1;mycolumn2
mysqlserver1;mydatabase3.mytable1;10;col-cond1;col-cond-2;mycolumn1;mycolumn2
...


  • Batch script file: DataVerification.cmd

@echo off
SETLOCAL
REM ==============================================
REM DataVerification.cmd - executes a given SQL script containing the verification logic.
REM ==============================================
cls
set currDir=%~dp0
set dataFile=%currDir%\testcases.csv
REM SQLCMD can read (if required) multiple input sql files delimited by comma.
set iSqlFiles=%currDir%\RowCnt.sql
set oLogFile=%currDir%\DVerification.log
echo ============================================== > %oLogFile%
echo ============= Verification Details =========== >> %oLogFile%
echo ============================================== >> %oLogFile%
REM The logic within the following for loop can be customized per project requirements.
REM Currently it skips the first line of the csv file and reads each ';'-delimited line as a row of columns.
for /F "tokens=1-7 skip=1 delims=;" %%c in (%dataFile%) do (
   echo ============================================== >> %oLogFile%
   echo Processing Database:Table: %%d >> %oLogFile%
   sqlcmd -S %%c -i %iSqlFiles% -v tableName="%%d" -v xRowCnt=%%e  -v col1-cond1=%%f -v col2-cond1=%%g -v col1Name="%%h" -v col2Name="%%i"  >> %oLogFile%
   echo ============================================== >> %oLogFile%
)
find "verification: failed" %oLogFile% > nul
if "%ERRORLEVEL%" EQU "0" (
    echo Combined Verification FAILED. Please see the %oLogFile% for details.
) else (
    echo Combined Verification PASSED. Please see the %oLogFile% for details.
)


  • SQL file: RowCnt.sql
PRINT N'Generated as of: '+RTRIM(CAST(GETDATE() AS nvarchar(30)));
Declare @CurrRowCount int;
SELECT @CurrRowCount = count(*) FROM $(tableName)
where $(tableName).$(col1Name) NOT LIKE '$(col1-cond1)' and $(tableName).$(col2Name) NOT LIKE '$(col2-cond1)';
PRINT N'Total row count: '+CAST(@CurrRowCount AS nvarchar(5))+N'.'
if (@CurrRowCount > $(xRowCnt)) PRINT N'verification: passed.';
else PRINT N'verification: failed.';


Finally, when all test cases defined in testcases.csv have been executed, you'll get either "PASSED" (if all test cases pass) or "FAILED" (if even a single test case fails). You can see the details in the generated log file, similar to the following ...

==============================================
============= Verification Details ===========
==============================================

==============================================
Processing Database:Table: mydatabase1.mytable1
Generated as of: <timestamp>
Total row count: xx.
verification: passed[failed].

==============================================
Processing Database:Table: mydatabase2.mytable1
Generated as of: <timestamp>
Total row count: yy.
verification: passed[failed].
==============================================

Obviously, this is just an idea: you can develop your own SQL file(s) for your data-processing requirements, modify the .csv file to match your input, and adjust the 'for' loop whenever the .csv format changes.