
HDF5 low-level write: extend and append array


Ken

Jun 10, 2010, 6:15:22 PM
I am trying to write single-precision 3D array (M x N x K) data to an HDF5 file, and I need to append the data as a series of new arrays onto an existing dataset within the file. From what I have learned so far, this is accomplished by extending the dataset dimensions, selecting a hyperslab, and writing the data into the new extent.

I have two datasets, each with dimensions of 48 x 100 x 2. My initial HDF5 file writes properly, and I have the first (48x100x2) dataset correct.

I am running into two problems:

1. My newly appended data do not match the MATLAB dimensions. As I scroll through the arrays in the dataset within the HDF5 file, the values that should be constant (value = 2.0) move upward along the dimension of length 100.

2. I have not found a way to append a new (48x100x2) dataset without first combining it with a dummy (48x100x2) dataset. In other words, I need a (96x100x2) buffer in order to append an additional 48 arrays to a (48x100x2) dataset.

Here is my code:


%----------------------------------------------------------------------------------------
% Initialize Data
%----------------------------------------------------------------------------------------

testdata = single(ones(48,100,2));
testdata2=testdata.*2;

data_initialize =zeros(size(testdata2));
data_combined=[ data_initialize; testdata2];

filename = 'test3dim.h5'
dsetname = 'my_dataset'

dims(1) = 48;
dims(2) = 100;
dims(3) = 2;

newdims(1) = 96;
newdims(2) = 100;
newdims(3) = 2;

chunk(1) = 48;
chunk(2) = 100;
chunk(3) = 2;

%----------------------------------------------------------------------------------------
% Create Initial HDF5 File
%----------------------------------------------------------------------------------------

%
% Create a new file using the default properties.
%
fileID = H5F.create(filename, 'H5F_ACC_TRUNC', 'H5P_DEFAULT', 'H5P_DEFAULT');
%
% Create dataspace with unlimited dimensions.
%
maxdims = {'H5S_UNLIMITED', 'H5S_UNLIMITED', 'H5S_UNLIMITED'};
space = H5S.create_simple (3, dims, maxdims);
%
% Create the dataset creation property list, add the gzip
% compression filter and set the chunk size.
%
dcpl = H5P.create('H5P_DATASET_CREATE');
H5P.set_deflate(dcpl, 9);
H5P.set_chunk(dcpl, chunk);
%
% Create the compressed unlimited dataset.
%
datasetID = H5D.create(fileID, dsetname, 'H5T_NATIVE_FLOAT', space, dcpl);
%
% Write the data to the dataset.
%
datatypeID = H5T.copy('H5T_NATIVE_FLOAT');
H5D.write(datasetID, datatypeID,'H5S_ALL', 'H5S_ALL','H5P_DEFAULT', testdata);

%
% Close and release resources.
%
H5P.close(dcpl);
H5D.close(datasetID);
H5S.close(space);
H5F.close(fileID);

% =================New HDF5 File Created ===================


%----------------------------------------------------------------------------------------
% Open Existing HDF5 File and Append Data to Dataset
%----------------------------------------------------------------------------------------

% Open Existing HDF5 File
fileID = H5F.open(filename, 'H5F_ACC_RDWR', 'H5P_DEFAULT');
% Open Existing Dataset
datasetID = H5D.open(fileID, dsetname);

% Extend Existing Dataset Dimensions
H5D.extend(datasetID, newdims);
space = H5D.get_space(datasetID);

% Setup Hyperslab
start = [0,0,0];
count= [48,100,2];
stride = [1,1,1];
block = [];
H5S.select_hyperslab(space, 'H5S_SELECT_NOTB', start, stride, count, block);

% Write Data to newly extended dimensions
H5D.write(datasetID, 'H5T_NATIVE_FLOAT', 'H5S_ALL', space,'H5P_DEFAULT', data_combined);

H5D.close(datasetID);
H5S.close(space);
H5F.close(fileID);
% ===============New HDF5 File Appended ===================

Any assistance in providing a solution to the two problems stated at the top of the message would be greatly appreciated.

Many Thanks!

Dinesh Iyer

Jun 11, 2010, 11:31:06 AM
Hello Ken,
The issue you are facing is due to the difference in memory ordering between C and MATLAB. The HDF5 library uses C-style (row-major) ordering for multidimensional arrays, while MATLAB uses Fortran-style (column-major) ordering.

For more information, please refer to point 4 in the README file at the link below:
http://www.hdfgroup.org/ftp/HDF5/examples/examples-by-api/api18-m.html
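To make the mapping concrete, here is a minimal sketch (variable names are illustrative): because MATLAB is column-major while HDF5 lists dimensions slowest-varying first, every size vector must be reversed before it is passed to the library.

```matlab
% MATLAB stores a 48x100x2 array in column-major order; HDF5 expects the
% slowest-varying dimension first, so the file-space dims are the reverse
% of MATLAB's size vector.
A = single(ones(48, 100, 2));
h5_dims = fliplr(size(A));   % [2 100 48], in HDF5 (C) order
% If the size vector is not flipped, the library interprets the same
% buffer with its axes transposed, which is why constant planes appear
% shifted along the wrong dimension in the file.
```

The same flip applies to maxdims, chunk, start, stride, count, and block.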

I am providing you with the modified code:

%----------------------------------------------------------------------------------------
% Initialize Data
%----------------------------------------------------------------------------------------

%%
testdata = single(ones(48,100,2));
testdata2=testdata.*2;

data_initialize =zeros(size(testdata2));
data_combined=[ data_initialize; testdata2];

filename = 'test3dim_mod.h5';
dsetname = 'my_dataset';

dims(1) = 48;
dims(2) = 100;
dims(3) = 2;

newdims(1) = 96;
newdims(2) = 100;
newdims(3) = 2;

chunk(1) = 48;
chunk(2) = 100;
chunk(3) = 2;

%----------------------------------------------------------------------------------------
% Create Initial HDF5 File
%----------------------------------------------------------------------------------------

%
% Create a new file using the default properties.
%
fileID = H5F.create(filename, 'H5F_ACC_TRUNC', 'H5P_DEFAULT', 'H5P_DEFAULT');
%
% Create dataspace with unlimited dimensions.
%
maxdims = {'H5S_UNLIMITED', 'H5S_UNLIMITED', 'H5S_UNLIMITED'};

h_dims = fliplr(dims);
h_maxdims = fliplr(maxdims);
space = H5S.create_simple (3, h_dims, h_maxdims);


%
% Create the dataset creation property list, add the gzip
% compression filter and set the chunk size.
%
dcpl = H5P.create('H5P_DATASET_CREATE');
H5P.set_deflate(dcpl, 9);

h5_chunk = fliplr(chunk);
H5P.set_chunk(dcpl, h5_chunk);


%
% Create the compressed unlimited dataset.
%
datasetID = H5D.create(fileID, dsetname, 'H5T_NATIVE_FLOAT', space, dcpl);
%
% Write the data to the dataset.
%
datatypeID = H5T.copy('H5T_NATIVE_FLOAT');
H5D.write(datasetID, datatypeID,'H5S_ALL', 'H5S_ALL','H5P_DEFAULT', testdata);

%
% Close and release resources.
%
H5P.close(dcpl);
H5D.close(datasetID);
H5S.close(space);
H5F.close(fileID);

% =================New HDF5 File Created ===================

%%


%----------------------------------------------------------------------------------------
% Open Existing HDF5 File and Append Data to Dataset
%----------------------------------------------------------------------------------------

% Open Existing HDF5 File
fileID = H5F.open(filename, 'H5F_ACC_RDWR', 'H5P_DEFAULT');
% Open Existing Dataset
datasetID = H5D.open(fileID, dsetname);

% Extend Existing Dataset Dimensions

h5_newdims = fliplr(newdims);
H5D.extend(datasetID, h5_newdims);
space = H5D.get_space(datasetID);

% Setup Hyperslab
start = [0 0 0]; h5_start = fliplr(start);
stride = [1 1 1]; h5_stride = fliplr(stride);
count = [1 1 1]; h5_count = fliplr(count);
block = [48 100 2]; h5_block = fliplr(block);
H5S.select_hyperslab(space, 'H5S_SELECT_NOTB', h5_start, h5_stride, h5_count, h5_block);

Ken

Jun 15, 2010, 12:47:04 AM
Dinesh,

Many thanks for your post. It was quite helpful.

The solution you presented produced clean results, but the dataset dimensions were not what I was attempting to achieve. However, studying your solution helped me resolve my primary problem: it focused my attention on the dimensions and on the start, stride, count, and block settings.
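For readers following along, the four selection parameters have distinct roles (per the HDF5 H5Sselect_hyperslab API), which a short commented sketch with the values from this thread makes explicit:

```matlab
% Selecting a single 48x100x2 block at the origin of the file dataspace:
start  = [0 0 0];      % offset of the selection within the dataset
stride = [1 1 1];      % spacing between successive blocks, per dimension
count  = [1 1 1];      % number of blocks selected, per dimension
block  = [48 100 2];   % size of each block
% The same elements are selected by count = [48 100 2] with the default
% block of [1 1 1]; grouping them into one block is simply more convenient.
```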

I ended up reverting back to my original script, but with two modifications:

1. I added a line to rearrange the data to be written:
data_combined=permute(data_combined,[3 2 1]);

2. I modified the hyperslab setup, similar to your script:

start = [0 0 0];
stride = [1 1 1];
count = [1 1 1];
block = [48 100 2];

These modifications produced the dataset I was attempting to obtain: 96 pages (arrays), each 100 rows by 2 columns. Here is the modified code:

%----------------------------------------------------------------------------------------
% Initialize Data
%----------------------------------------------------------------------------------------

testdata = single(ones(48,100,2));
testdata2=testdata.*2;

data_initialize =zeros(size(testdata2));
data_combined=[ data_initialize; testdata2];

data_combined=permute(data_combined,[3 2 1]);

filename = 'test3dim10.h5'
dsetname = 'my_dataset'

dims(1) = 48;
dims(2) = 100;
dims(3) = 2;

newdims(1) = 96;
newdims(2) = 100;
newdims(3) = 2;

chunk(1) = 48;
chunk(2) = 100;
chunk(3) = 2;

%----------------------------------------------------------------------------------------
% Create Initial HDF5 File
%----------------------------------------------------------------------------------------


%
% Create a new file using the default properties.
%
fileID = H5F.create(filename, 'H5F_ACC_TRUNC', 'H5P_DEFAULT', 'H5P_DEFAULT');
%
% Create dataspace with unlimited dimensions.
%
maxdims = {'H5S_UNLIMITED', 'H5S_UNLIMITED', 'H5S_UNLIMITED'};

space = H5S.create_simple (3, dims, maxdims);

%
% Create the dataset creation property list, add the gzip
% compression filter and set the chunk size.
%
dcpl = H5P.create('H5P_DATASET_CREATE');
H5P.set_deflate(dcpl, 9);

H5P.set_chunk(dcpl, chunk);

%
% Create the compressed unlimited dataset.
%
datasetID = H5D.create(fileID, dsetname, 'H5T_NATIVE_FLOAT', space, dcpl);
%
% Write the data to the dataset.
%
datatypeID = H5T.copy('H5T_NATIVE_FLOAT');
H5D.write(datasetID, datatypeID,'H5S_ALL', 'H5S_ALL','H5P_DEFAULT', testdata);

%
% Close and release resources.
%
H5P.close(dcpl);
H5D.close(datasetID);
H5S.close(space);
H5F.close(fileID);

% ====================New HDF5 File Created ================


%----------------------------------------------------------------------------------------
% Open Existing HDF5 File and Append Data to Dataset
%----------------------------------------------------------------------------------------

% Open Existing HDF5 File
fileID = H5F.open(filename, 'H5F_ACC_RDWR', 'H5P_DEFAULT');
% Open Existing Dataset
datasetID = H5D.open(fileID, dsetname);

% Extend Existing Dataset Dimensions

H5D.extend(datasetID, newdims);
space = H5D.get_space(datasetID);

% Setup Hyperslab
start = [0 0 0];
stride = [1 1 1];
count = [1 1 1];
block = [48 100 2];

H5S.select_hyperslab(space, 'H5S_SELECT_NOTB', start, stride, count, block);

% Write Data to newly extended dimensions
H5D.write(datasetID, 'H5T_NATIVE_FLOAT', 'H5S_ALL', space, 'H5P_DEFAULT', data_combined);

H5D.close(datasetID);
H5S.close(space);
H5F.close(fileID);
% ==========New HDF5 File Appended =========================

The next question I have is:

Is there any way to append the second set of 48 arrays (or pages) to the dataset without including a set of 48 "dummy" arrays prior to the new 48 as filler? I'd like to append an additional 48 arrays without first making that dimension 96 long.

I tried various attempts at changing the "start" value to different values (for example start = [48 0 0]; and other attempts), but I could not get the data to write.

Would I need to use H5S.offset_simple for this?

Thanks again!


Ken

Jun 15, 2010, 2:03:20 PM
OK, I am getting very close.

By moving my "H5D.get_space" call up one line so that it occurs prior to extending the dimensions, I am able to append to the dataset without the "dummy arrays" requirement.

Here is my latest code:

%----------------------------------------------------------------------------------------
% Initialize Data
%----------------------------------------------------------------------------------------

testdata = single(ones(48,100,2));
testdata2=testdata.*2;

data_initialize =zeros(size(testdata2));
data_combined=[ data_initialize; testdata2];
data_combined=permute(data_combined,[3 2 1]);

data2_perm=permute(testdata2,[3 2 1]);

filename = 'test3dim_newmod.h5'
dsetname = 'my_dataset'

H5P.set_chunk(dcpl, chunk);

% ===============New HDF5 File Created ====================


%----------------------------------------------------------------------------------------
% Open Existing HDF5 File and Append Data to Dataset
%----------------------------------------------------------------------------------------

% Open Existing HDF5 File
fileID = H5F.open(filename, 'H5F_ACC_RDWR', 'H5P_DEFAULT');
% Open Existing Dataset
datasetID = H5D.open(fileID, dsetname);

% Get Data Space and Extend Existing Dataset Dimensions
space = H5D.get_space(datasetID);
H5D.extend(datasetID, newdims);

%
% Setup Hyperslab ---- See: http://www.hdfgroup.org/HDF5/doc1.6/RM_H5S.html#Dataspace-SelectHyperslab for more info.
start = [47 0 0];
stride = [1 1 1];
count = [1 1 1];
block = [1 100 2];

H5S.select_hyperslab(space, 'H5S_SELECT_AND', start, stride, count, block);

% Write Data to newly extended dimensions

H5D.write(datasetID, 'H5T_NATIVE_FLOAT', 'H5S_ALL', space,'H5P_DEFAULT', data2_perm);

H5D.close(datasetID);
H5S.close(space);
H5F.close(fileID);
% ===================New HDF5 File Appended =================

The only problem I am having now is that I am one array away from the exact result I want. Note that start = [47 0 0]; I need this to be start = [48 0 0];, but when I try that I get the following error:
___________________________________________________________________
??? Error using ==> H5ML.hdf5 at 25
The HDF5 library encountered an error:

Error in ==> H5D.write at 24
H5ML.hdf5('H5Dwrite', dataset_id, mem_type_id, mem_space_id, file_space_id, plist_id, buf);

Error in ==> test18 at 98
H5D.write(datasetID, 'H5T_NATIVE_FLOAT', 'H5S_ALL', space,'H5P_DEFAULT', data2_perm);
____________________________________________________________________

Any suggestions would be greatly appreciated.

Thanks!

Dinesh Iyer

Jun 15, 2010, 2:46:04 PM
Hello Ken,
I have written a simple example that shows how you can append data to a dataset after extending its dimensions. You can modify this code to suit your needs.

Dinesh

%%
clc
clear all

% The objective of this exercise is to write data into a dataset that has
% unlimited dimensions. We will create a dataset with an initial size
% of 10-by-20. It will then be extended up to 10-by-25 by adding a 10-by-5
% hyperslab of data.

% Create the HDF5 file
fileName = 'myFile.h5';
fcpl_id = H5P.create('H5P_FILE_CREATE');
fapl_id = H5P.create('H5P_FILE_ACCESS');

fid = H5F.create(fileName, 'H5F_ACC_TRUNC', fcpl_id, fapl_id);

% Create the Space for the Dataset
initDims = [10 20];
h5_initDims = fliplr(initDims);
maxDims = [10 -1];
h5_maxDims = fliplr(maxDims);
space_id = H5S.create_simple(2, h5_initDims, h5_maxDims);

% Create the Dataset
dsetName = 'myDataset';
dcpl_id = H5P.create('H5P_DATASET_CREATE');
chunkSize = [10 1];
h5_chunkSize = fliplr(chunkSize);
H5P.set_chunk(dcpl_id, h5_chunkSize);

dsetType_id = H5T.copy('H5T_NATIVE_DOUBLE');

dset_id = H5D.create(fid, dsetName, dsetType_id, space_id, dcpl_id);

% Initial Data to Write
rowDim = initDims(1); colDim = initDims(2);
initDataToWrite = reshape( (0:rowDim*colDim-1), rowDim, colDim );

% Write the initial data
H5D.write(dset_id, 'H5ML_DEFAULT', 'H5S_ALL', 'H5S_ALL', 'H5P_DEFAULT', initDataToWrite);

% Close the open Identifiers
H5S.close(space_id);
H5D.close(dset_id);
H5F.close(fid);

%%
% Open the dataset and append data to the unlimited dimension, which in
% this case is the second dimension as seen from MATLAB.
fid = H5F.open(fileName, 'H5F_ACC_RDWR', 'H5P_DEFAULT');
dset_id = H5D.open(fid, dsetName);

% Create the data to be appended
dimsOfData = [ 10 5 ];
h5_dimsOfData = fliplr(dimsOfData);

% Get the Dataspace of the Dataset to be appended
space_id = H5D.get_space(dset_id);

[~, h5_currDims] = H5S.get_simple_extent_dims(space_id);
currDims = fliplr(h5_currDims);

% Update the extent of the dataspace to match the data to be appended
newDims = currDims;
newDims(2) = currDims(2) + dimsOfData(2);
h5_newDims = fliplr(newDims);

H5D.set_extent(dset_id, h5_newDims);

% Data to append
rowDim = dimsOfData(1); colDim = dimsOfData(2);
dataToWrite = 10*reshape( (0:rowDim*colDim-1), rowDim, colDim );

% Update the File Space ID such that only the appended data is written.
H5S.close(space_id);
space_id = H5D.get_space(dset_id);

% Define the hyperslab selection
start = [0 currDims(2)]; h5_start = fliplr(start);
stride = [1 1]; h5_stride = fliplr(stride);
count = [1 1]; h5_count = fliplr(count);
block = dimsOfData; h5_block = fliplr(block);

H5S.select_hyperslab(space_id, 'H5S_SELECT_SET', h5_start, h5_stride, h5_count, h5_block);

% Write the Data
memSpace_id = H5S.create_simple(2, h5_dimsOfData, []);
H5D.write(dset_id, 'H5ML_DEFAULT', memSpace_id, space_id, 'H5P_DEFAULT', dataToWrite);

% Close the open identifiers
H5S.close(memSpace_id);
H5S.close(space_id);
H5D.close(dset_id);
H5F.close(fid);

Ken

Jun 15, 2010, 3:49:06 PM
Thank you Dinesh! You helped me solve my dilemma!

The memSpace_id did the trick.
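To spell out why this works: the memory dataspace describes the shape of the buffer handed to H5D.write, while the hyperslab selection on the file dataspace says where those values land in the dataset. Passing 'H5S_ALL' as the memory space, as in the earlier attempts, tells the library to reuse the file dataspace as the description of the buffer, which no longer matches a single 48x100x2 slab once the dataset has been extended. The pattern, in brief (`buf` stands for the permuted data to append):

```matlab
% The memory space matches the buffer (one 48x100x2 slab); the file-space
% hyperslab selection positions that slab at offset `start` inside the
% extended dataset.
block = [48 100 2];
memspaceID = H5S.create_simple(3, block, []);
H5S.select_hyperslab(space, 'H5S_SELECT_SET', start, stride, count, block);
H5D.write(datasetID, 'H5T_NATIVE_FLOAT', memspaceID, space, 'H5P_DEFAULT', buf);
```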

For those interested, here is the final code that writes the file and dataset with the correct dimensions that I was after:

%----------------------------------------------------------------------------------------
% Initialize Data
%----------------------------------------------------------------------------------------

testdata = single(ones(48,100,2));
testdata2=testdata.*2;

data_initialize =zeros(size(testdata2));
data_combined=[ data_initialize; testdata2];
data_combined=permute(data_combined,[3 2 1]);
data2_perm=permute(testdata2,[3 2 1]);

filename = 'test3dim_works.h5'
dsetname = 'my_dataset'

H5P.set_chunk(dcpl, chunk);

% =============New HDF5 File Created ===================


%----------------------------------------------------------------------------------------
% Open Existing HDF5 File and Append Data to Dataset
%----------------------------------------------------------------------------------------

% Open Existing HDF5 File
fileID = H5F.open(filename, 'H5F_ACC_RDWR', 'H5P_DEFAULT');
% Open Existing Dataset
datasetID = H5D.open(fileID, dsetname);

% Get Data Space and Extend Existing Dataset Dimensions

H5D.extend(datasetID, newdims);
space = H5D.get_space(datasetID);

%
% Setup Hyperslab ----
% See:
% http://www.hdfgroup.org/HDF5/doc1.6/RM_H5S.html#Dataspace-SelectHyperslab
% for more info.
%
start = [48 0 0];

stride = [1 1 1];
count = [1 1 1];

block = [48 100 2];

H5S.select_hyperslab(space, 'H5S_SELECT_SET', start, stride, count, block);

% Write Data to newly extended dimensions

memspaceID = H5S.create_simple(3, block, []);
H5D.write(datasetID, 'H5T_NATIVE_FLOAT', memspaceID, space,'H5P_DEFAULT', data2_perm);

H5S.close(memspaceID);
H5S.close(space);
H5D.close(datasetID);
H5F.close(fileID);

% ===============New HDF5 File Appended ====================


Thanks again!

Markus Krug

Aug 8, 2016, 9:33:08 AM
Hi Dinesh,

I'm looking for something similar; however, I need two extensions:
1) I want to write compound datasets that are extended by one row each time I write to the file. In effect, I'm writing to a table with a header row like 'Time', 'SensorValue1', 'SensorValue2', ... . Each time I write to this table, a new set of sensor values with their time stamp is appended to the HDF5 file.
2) I want to write the data in compressed form (with a filter).

Can you extend your example to cover this?

Best Regards
Markus


"Dinesh Iyer" wrote in message <hv8hpc$hau$1...@fred.mathworks.com>...