It is not a bug; however, the documentation is missing an important item. When an array is expanded, or its values concatenated or calculated, the number of array items used is limited to the value of the LIMIT option, whose default value is 10 (as for row expanding with the XML table type).
I recognize that this seems too drastic in your example, because the expansion here produces the total number of rows of the table. The limit was introduced initially to restrict the size of, for instance, the concatenated names of the authors of a book. I shall see whether it should be suppressed or modulated differently.
Your problem could be fixed by adding OPTION_LIST='Limit=n' to the CREATE TABLE statement, n being greater than the size of your table.
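For instance, assuming the original table expanded the items array through a column path such as FIELD_FORMAT='items:[X]:value' (the path, column names and file name are assumptions based on this example and may differ in your setup), the workaround would look like:

create table json (id int, value char(1) field_format='items:[X]:value') engine=connect, table_type=JSON, file_name='test.json', option_list='Limit=100';

Here 100 stands for any value of n greater than the number of items in the array.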
However, the true fix here is to define the whole table on the array (instead of expanding it):
create table json (id int, value char(1)) engine=connect, table_type=json, file_name='test.json', option_list='object=items';
Here the limit does not apply.
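For reference, this definition assumes that test.json holds a top-level object whose items member is the array containing the rows, along these lines (the file contents are an assumption, not taken from the report):

{"items": [{"id": 1, "value": "a"}, {"id": 2, "value": "b"}]}

Each element of the items array then becomes one table row, with no LIMIT restriction involved.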