I just learned the hard way that the Windows ODBC driver API apparently requires a SQL_BINARY input parameter array to be terminated with a zero byte. I couldn't find any such statement in the documentation; I discovered it while executing a stored procedure with this code:
Minimal Example
// Parameter binding
BYTE data[10] = { 15, 3, 54, 144, 34, 211, 200, 147, 15, 74 };
SQLBindParameter(hstmt, 1, SQL_PARAM_INPUT, SQL_C_BINARY, SQL_BINARY, 10, 0, data, 0, NULL);
// Procedure execution
SQLRETURN res = SQLExecDirect(hstmt, (SQLCHAR*)"{call dbo.Update_Data(?)}", SQL_NTS);
It caused SQLExecDirect to fail with return code -1 (SQL_ERROR). Querying the diagnostic records with SQLGetDiagRec gave me this record:
SQL State = 22001, Error Msg: "[Microsoft][ODBC SQL Server Driver] String or binary data would be truncated"
While this usually indicates that the data being inserted or updated is larger than the target column, that isn't the case here. After four hours of trying different parameters and statements, I finally found that the solution is as simple as terminating the byte array with a zero in the last position:
// Parameter binding
BYTE data[11] = { 15, 3, 54, 144, 34, 211, 200, 147, 15, 74, 0 }; // <- 0 termination here
SQLBindParameter(hstmt, 1, SQL_PARAM_INPUT, SQL_C_BINARY, SQL_BINARY, 10, 0, data, 0, NULL);
// Procedure execution
SQLRETURN res = SQLExecDirect(hstmt, (SQLCHAR*)"{call dbo.Update_Data(?)}", SQL_NTS);
Now I fail to understand why this is necessary. SQLBindParameter already receives the length of the data (10, via the ColumnSize/cbColDef parameter), so why does it still scan for a zero byte?
To my understanding, zero termination is used where the length of an array is not given by a separate length variable but by a terminating zero value, as is usual for strings. For binary data that makes little sense to me, because the payload may legitimately contain zero bytes before its actual end (which is determined by a length indicator). I could easily run into exactly that case, so it would be great if there was some way to avoid zero-terminating the byte array.
Full Example
As requested in the comments, here is a full code dump of the unit test:
#include <windows.h>
#include <tchar.h>   // _TCHAR in main's signature
#include <stdio.h>   // printf, sprintf_s
#include <string.h>  // strlen
#include <sql.h>
#include <sqlext.h>
// Settings
#define SIP "127.0.0.1"
#define SPort 1433
#define SUID "User"
#define SPW "PW"
#define SDB "world"
// Global ODBC mem
SQLHENV henv;
SQLHDBC hdbc;
SQLHSTMT hstmt;
// Logs diagnostic records (minimal implementation)
void ProcessLogs(SQLSMALLINT plm_handle_type, SQLHANDLE &plm_handle)
{
    SQLSMALLINT rec = 0;
    SQLSMALLINT len;
    SQLINTEGER native;
    SQLCHAR state[6], msg[SQL_MAX_MESSAGE_LENGTH];
    while (SQL_SUCCEEDED(SQLGetDiagRec(plm_handle_type, plm_handle, ++rec, state, &native, msg, (SQLSMALLINT)sizeof(msg), &len)))
        printf("SQL State = %s, Native Error = %d, Error Msg: \"%s\"\n", state, (int)native, msg);
}
// The query being tested
void TestQuery()
{
    int col = 0;
    SQLRETURN res = SQL_NTS;
    // Params
    ULONGLONG id = 44;
    BYTE data[10] = { 15, 3, 54, 144, 34, 211, 200, 147, 15, 74 };
    SQLBindParameter(hstmt, ++col, SQL_PARAM_INPUT, SQL_C_UBIGINT, SQL_BIGINT, 0, 0, &id, 0, NULL);
    SQLBindParameter(hstmt, ++col, SQL_PARAM_INPUT, SQL_C_BINARY, SQL_BINARY, 10, 0, data, 0, NULL);
    // Execution
    res = SQLExecDirect(hstmt, (SQLCHAR*)"{call dbo.Update_Store_Data(?,?)}", SQL_NTS);
    if (res != SQL_SUCCESS && res != SQL_SUCCESS_WITH_INFO)
    {
        printf("Error during query execution: %hd\n", res);
        ProcessLogs(SQL_HANDLE_STMT, hstmt);
    }
}
// ODBC Driver initialization
bool ODBCInit()
{
    // Init ODBC handles
    RETCODE res;
    res = SQLAllocHandle(SQL_HANDLE_ENV, SQL_NULL_HANDLE, &henv);
    res = SQLSetEnvAttr(henv, SQL_ATTR_ODBC_VERSION, (SQLPOINTER)SQL_OV_ODBC3, SQL_IS_INTEGER);
    res = SQLAllocHandle(SQL_HANDLE_DBC, henv, &hdbc);
    // Connection string
    char connStr[512];
    sprintf_s(connStr
        , sizeof(connStr)
        , "DRIVER={SQL Server};SERVER=%s;ADDRESS=%s,%d;NETWORK=DBMSSOCN;UID=%s;PWD=%s;DATABASE=%s"
        , SIP
        , SIP
        , SPort
        , SUID
        , SPW
        , SDB);
    // Connection
    char outStr[512];
    SQLSMALLINT pcb;
    res = SQLDriverConnect(hdbc, NULL, (SQLCHAR*)connStr, (SQLSMALLINT)strlen(connStr), (SQLCHAR*)outStr, ARRAYSIZE(outStr), &pcb, SQL_DRIVER_NOPROMPT);
    if (res != SQL_SUCCESS && res != SQL_SUCCESS_WITH_INFO)
    {
        printf("Error during driver connection: %hd\n", res);
        return false;
    }
    // Query handle
    res = SQLAllocHandle(SQL_HANDLE_STMT, hdbc, &hstmt);
    return true;
}
// ODBC Driver handles cleanup
void ODBCClean()
{
    if (hstmt != SQL_NULL_HSTMT)
        SQLFreeHandle(SQL_HANDLE_STMT, hstmt);
    if (hdbc != SQL_NULL_HDBC)
    {
        SQLDisconnect(hdbc); // close the connection before freeing its handle
        SQLFreeHandle(SQL_HANDLE_DBC, hdbc);
    }
    if (henv != SQL_NULL_HENV)
        SQLFreeHandle(SQL_HANDLE_ENV, henv);
}
int main(int argc, _TCHAR* argv[])
{
    if (ODBCInit())
        TestQuery();
    ODBCClean();
    return 0;
}
The SQL Table definition:
CREATE TABLE dbo.Store
(
    UniqueNumber BIGINT IDENTITY(1,1) NOT NULL,
    ItemID BIGINT NOT NULL,
    AccountUniqueNumber BIGINT NOT NULL,
    StorageType INT NOT NULL,
    Count INT NOT NULL,
    Data BINARY(10) NOT NULL
)
The called procedure:
CREATE PROCEDURE dbo.Update_Store_Data
    @ID BIGINT,
    @Data BINARY(10)
AS
BEGIN
    UPDATE dbo.Store
    SET Data = @Data
    WHERE UniqueNumber = @ID
END
Comments:
"… the BufferLength to the function, i.e. SQLBindParameter(..., data, 10, NULL);"
"… a pointer (StrLen_or_IndPtr) instead of NULL, what did you get? It should return the length of binary data, right?"
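Following the suggestion in the comments: the SQLBindParameter documentation states that when StrLen_or_IndPtr is a null pointer, the driver assumes all input parameters are non-NULL and that character and binary data is null-terminated, which would explain the behavior above. Binding with an explicit length indicator should therefore avoid the terminator entirely. A sketch (not verified against a live server):

```cpp
// Sketch (untested): bind the binary parameter with an explicit length
// indicator instead of relying on a terminating zero byte.
BYTE data[10] = { 15, 3, 54, 144, 34, 211, 200, 147, 15, 74 };
SQLLEN cbData = sizeof(data); // actual payload length; must stay alive until execution
SQLBindParameter(hstmt, 1, SQL_PARAM_INPUT, SQL_C_BINARY, SQL_BINARY,
                 10,            // ColumnSize
                 0,             // DecimalDigits
                 data,          // ParameterValuePtr
                 sizeof(data),  // BufferLength
                 &cbData);      // StrLen_or_IndPtr: explicit length, no zero-byte scan
SQLRETURN res = SQLExecDirect(hstmt, (SQLCHAR*)"{call dbo.Update_Data(?)}", SQL_NTS);
```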