In Lazarus, I am loading libmariadb.so.3 v3.3.10 via LoadLibrary(). (I did another test with libmysqlclient.so.21 v8.0.41.) When iterating over the fields of a query result, the length of each value is given by the corresponding item of the array returned by mysql_fetch_lengths():
type
  PMYSQL_LENGTHS = ^TMYSQL_LENGTHS;
  TMYSQL_LENGTHS = array[0..4095] of LongWord;

var
  // function pointer, assigned via GetProcAddress after LoadLibrary
  mysql_fetch_lengths: function(Result: PMYSQL_RES): PMYSQL_LENGTHS; cdecl;
  LengthsPointer: PMYSQL_LENGTHS;
  i: Integer;

begin
  LengthsPointer := mysql_fetch_lengths(FCurrentResults);
  for i := 0 to NumFieldsInResult - 1 do
    ShowMessage(IntToStr(LengthsPointer^[i]));
end;
That worked nicely for years, in Delphi and also in Lazarus with a Windows executable, and I am using the same libmysql/libmariadb versions on Windows as on Linux. But when I compile the same code in Lazarus for Linux and run it, TMYSQL_LENGTHS seems to contain 8-byte items (Int64/QWord), not 4-byte LongWord/UInt32 items. Keeping the LongWord definition, the loop shows twice as many items as the table has columns, with a 0 at every odd index, e.g.:
fieldno   LengthsPointer[fieldno]
0         2
1         0
2         10
3         0
4         3
5         0
6         1024
7         0
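As a quick sanity check (the TQWordArray helper below is a throwaway of mine, not part of the header translation), reading the same buffer as 8-byte QWord values does yield one plausible length per column:

type
  // throwaway helper type, only for this test
  PQWordArray = ^TQWordArray;
  TQWordArray = array[0..4095] of QWord;
var
  L64: PQWordArray;
begin
  // reinterpret the returned pointer as 8-byte entries
  L64 := PQWordArray(mysql_fetch_lengths(FCurrentResults));
  for i := 0 to NumFieldsInResult - 1 do
    ShowMessage(IntToStr(L64^[i]));  // one length per column: 2, 10, 3, 1024, ...
end;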
The documentation clearly states that these are unsigned long integers, even in the newest versions. So I assume I am doing something wrong. Am I?
C's long type has no fixed size: it follows the data model of the target, which depends on both the CPU width and the operating system. 64-bit Windows uses LLP64, where unsigned long is still 32 bits, while 64-bit Linux uses LP64, where it is 64 bits. That matches exactly what you see: each 64-bit length, read as two 32-bit halves on a little-endian machine, appears as the value followed by a 0. So a hard-coded LongWord in the Pascal translation cannot be correct on every target; the element type has to track the target's unsigned long. Don't mix declarations across different platforms, and don't mix vendors.
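A minimal sketch of a portable declaration, assuming FPC/Lazarus with the ctypes unit (culong is FPC's alias for the target's C unsigned long; everything else is taken from your snippet):

uses
  ctypes; // FPC unit: culong tracks C's "unsigned long" on each target

type
  // 4 bytes on Windows (LLP64, including Win64) and 32-bit Unix,
  // 8 bytes on 64-bit Linux (LP64) -- matching the library's C header
  TMYSQL_LENGTHS = array[0..4095] of culong;
  PMYSQL_LENGTHS = ^TMYSQL_LENGTHS;

Delphi has no ctypes unit, so there you would need a conditional type instead (LongWord on Windows, UInt64 on 64-bit POSIX targets). Note that NativeUInt is not a substitute: it is 8 bytes on Win64, where unsigned long is still 4.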