
Commit 25aef6d

Major zenodo and serialization enhancements (#28)
* improve zenodo and serialization (#27)
  - fix issues with ZenodoRecord
  - add support of Literals in RDF attributes, e.g. `h5.frdf["description"].object = rdflib.Literal("An english description", "en")`
  - removed deprecated methods in `ZenodoSandboxDeposit`
  - improve serializing HDF5 contextual and structural metadata to RDF-based formats
  - minor bugfixes
* fix docs and datetime serialization
* update readme and CITATION.cff
1 parent 37a4861 commit 25aef6d

37 files changed, +1185 -993 lines changed

CHANGELOG.md

Lines changed: 13 additions & 1 deletion
@@ -2,6 +2,18 @@
 
 Log of changes in the versions
 
+## v2.4.0-rc.2
+
+- fix issues with ZenodoRecord
+- add support of Literals in RDF attributes, e.g. h5.frdf["description"].object = rdflib.Literal("An english
+  description", "en")
+- removed deprecated methods in `ZenodoSandboxDeposit`
+- add support of Literals in RDF attributes, e.g.
+  `h5.frdf["description"].object = rdflib.Literal("An english description", "en")`
+- improve serializing HDF5 contextual and structural metadata to RDF-based formats
+- minor bugfixes
+- default dtime format used within h5rdmtoolbox is now ISO 8601 ('%Y-%m-%dT%H:%M:%S%f')
+
 ## v2.4.0-rc.1
 
 - allow numpy 2.x versions
@@ -15,7 +27,7 @@ Log of changes in the versions
 
 ## v2.3.1
 
-- fixing error in parsing obj name. "/" is safed and will not be converted anymore
+- fixing error in parsing obj name. "/" is saved and will not be converted anymore
 
 ## v2.3.0
 
CITATION.cff

Lines changed: 2 additions & 2 deletions
@@ -11,7 +11,7 @@ authors:
 given-names: "Lucas"
 orcid: "https://orcid.org/0000-0002-4116-0065"
 title: "h5rdmtoolbox - HDF5 Research Data Management Toolbox"
-version: 2.4.0-rc.1
-doi: 10.5281/zenodo.17088483
+version: 2.4.0-rc.2
+doi: 10.5281/zenodo.17334652
 date-released: 2025-10-12
 url: "https://github.com/matthiasprobst/h5rdmtoolbox"

README.md

Lines changed: 5 additions & 6 deletions
@@ -94,14 +94,11 @@ A paper is published in the journal [inggrid](https://preprints.inggrid.org/repo
 
 ## Installation
 
-Use python 3.8 or higher (automatic testing is performed until 3.12). If you are a regular user, you can install the
+Use python 3.9 or higher (automatic testing is performed until 3.13). If you are a regular user, you can install the
 package via pip:
 
 pip install h5RDMtoolbox
 
-or if you prefer `uv`:
-
-uv install h5RDMtoolbox
 
 ### Install from source:
 
@@ -176,10 +173,12 @@ Install optional dependencies by specifying them in square brackets after the pa
 ## Citing the package
 
 If you intend to use the package in your work, you may cite the software itself as published on paper in the
-[Zenodo](https://zenodo.org/records/14473697) repository. A related paper is published in the
+[Zenodo (latest version)](https://zenodo.org/records/13309253) repository. A related paper is published in the
 journal [inggrid](https://www.inggrid.org/article/id/4028/). Thank you!
 
-Here's the bibtext to it:
+Alternatively or additionally, you can consult the `CITATION.cff` file.
+
+Here is the BibTeX entry:
 
 ```
 @article{probst2024h5rdmtoolbox,

codemeta.json

Lines changed: 1 addition & 1 deletion
@@ -4,7 +4,7 @@
 "license": "https://spdx.org/licenses/MIT",
 "codeRepository": "git+https://github.com/matthiasprobst/h5RDMtoolbox.git",
 "name": "h5RDMtoolbox",
-"version": "2.4.0-rc.1",
+"version": "2.4.0-rc.2",
 "description": "Supporting a FAIR Research Data lifecycle using Python and HDF5.",
 "applicationCategory": "Engineering",
 "programmingLanguage": [

docs/colab/quickstart.ipynb

Lines changed: 1 addition & 1 deletion
@@ -7,7 +7,7 @@
 "scrolled": true
 },
 "outputs": [],
-"source": "# !pip install h5rdmtoolbox==2.4.0rc1"
+"source": "# !pip install h5rdmtoolbox==2.4.0rc2"
 },
 {
 "cell_type": "code",

docs/gettingstarted/quickoverview.ipynb

Lines changed: 93 additions & 64 deletions
Large diffs are not rendered by default.

docs/practical_examples/knowledge_graph.ipynb

Lines changed: 16 additions & 156 deletions
@@ -12,7 +12,7 @@
 },
 {
 "cell_type": "code",
-"execution_count": 1,
+"execution_count": null,
 "id": "a9b58411-88af-4972-b5c8-b01391df58fc",
 "metadata": {
 "scrolled": true
@@ -24,7 +24,7 @@
 },
 {
 "cell_type": "code",
-"execution_count": 2,
+"execution_count": null,
 "id": "feb70a18-b9e7-40db-b369-4e6c927301e5",
 "metadata": {},
 "outputs": [],
@@ -36,7 +36,7 @@
 },
 {
 "cell_type": "code",
-"execution_count": 3,
+"execution_count": null,
 "id": "690c0219-d7f5-4b0d-a897-59cf8f354149",
 "metadata": {},
 "outputs": [],
@@ -46,109 +46,12 @@
 },
 {
 "cell_type": "code",
-"execution_count": 4,
+"execution_count": null,
 "id": "d4d871e0-5bff-42f7-a27d-f1f0337dd964",
 "metadata": {
 "scrolled": true
 },
-"outputs": [
-{
-"name": "stdout",
-"output_type": "stream",
-"text": [
-"@prefix dcterms: <http://purl.org/dc/terms/> .\n",
-"@prefix ex: <https://example.org#> .\n",
-"@prefix foaf: <http://xmlns.com/foaf/0.1/> .\n",
-"@prefix hdf: <http://purl.allotrope.org/ontologies/hdf5/1.8#> .\n",
-"@prefix prov: <http://www.w3.org/ns/prov#> .\n",
-"@prefix xsd: <http://www.w3.org/2001/XMLSchema#> .\n",
-"\n",
-"ex:tmp0.hdf a hdf:File ;\n",
-"    hdf:rootGroup <https://example.org#tmp0.hdf/> .\n",
-"\n",
-"<https://example.org#tmp0.hdf/> a hdf:Group ;\n",
-"    hdf:member <https://example.org#tmp0.hdf/contact>,\n",
-"        <https://example.org#tmp0.hdf/nd-array>,\n",
-"        <https://example.org#tmp0.hdf/test> ;\n",
-"    hdf:name \"/\"^^xsd:string .\n",
-"\n",
-"<https://example.org#tmp0.hdf/contact> a hdf:Group ;\n",
-"    hdf:attribute <https://example.org#tmp0.hdf/contact@fname>,\n",
-"        <https://example.org#tmp0.hdf/contact@hint>,\n",
-"        <https://example.org#tmp0.hdf/contact@lname> ;\n",
-"    hdf:name \"/contact\"^^xsd:string ;\n",
-"    dcterms:relation <https://orcid.org/0000-0001-8729-0482> .\n",
-"\n",
-"<https://example.org#tmp0.hdf/contact@fname> a hdf:StringAttribute ;\n",
-"    hdf:data \"Matthias\"^^xsd:string ;\n",
-"    hdf:name \"fname\" .\n",
-"\n",
-"<https://example.org#tmp0.hdf/contact@hint> a hdf:StringAttribute ;\n",
-"    hdf:data \"This group could be representing a person.\"^^xsd:string ;\n",
-"    hdf:name \"hint\" .\n",
-"\n",
-"<https://example.org#tmp0.hdf/contact@lname> a hdf:StringAttribute ;\n",
-"    hdf:data \"Probst\"^^xsd:string ;\n",
-"    hdf:name \"lname\" .\n",
-"\n",
-"<https://example.org#tmp0.hdf/nd-array> a hdf:Dataset ;\n",
-"    hdf:chunk <https://example.org#tmp0.hdf/nd-array__chunk_dimensions> ;\n",
-"    hdf:dataspace <https://example.org#tmp0.hdf/nd-array__dataspace> ;\n",
-"    hdf:datatype hdf:H5T_IEEE_F64LE,\n",
-"        \"H5T_FLOAT\" ;\n",
-"    hdf:layout hdf:H5D_CHUNKED ;\n",
-"    hdf:maximumSize 6 ;\n",
-"    hdf:name \"/nd-array\" ;\n",
-"    hdf:rank 2 ;\n",
-"    hdf:size 6 .\n",
-"\n",
-"<https://example.org#tmp0.hdf/nd-array__chunk_dimension_0> a hdf:DataspaceDimension ;\n",
-"    hdf:dimensionIndex 0 ;\n",
-"    hdf:size 1 .\n",
-"\n",
-"<https://example.org#tmp0.hdf/nd-array__chunk_dimension_1> a hdf:DataspaceDimension ;\n",
-"    hdf:dimensionIndex 1 ;\n",
-"    hdf:size 3 .\n",
-"\n",
-"<https://example.org#tmp0.hdf/nd-array__chunk_dimensions> a hdf:ChunkDimension ;\n",
-"    hdf:dimension <https://example.org#tmp0.hdf/nd-array__chunk_dimension_0>,\n",
-"        <https://example.org#tmp0.hdf/nd-array__chunk_dimension_1> .\n",
-"\n",
-"<https://example.org#tmp0.hdf/nd-array__dataspace> a hdf:SimpleDataspace ;\n",
-"    hdf:dimension <https://example.org#tmp0.hdf/nd-array__dataspace_dimension_0>,\n",
-"        <https://example.org#tmp0.hdf/nd-array__dataspace_dimension_1> .\n",
-"\n",
-"<https://example.org#tmp0.hdf/nd-array__dataspace_dimension_0> a hdf:DataspaceDimension ;\n",
-"    hdf:dimensionIndex 0 ;\n",
-"    hdf:size 2 .\n",
-"\n",
-"<https://example.org#tmp0.hdf/nd-array__dataspace_dimension_1> a hdf:DataspaceDimension ;\n",
-"    hdf:dimensionIndex 1 ;\n",
-"    hdf:size 3 .\n",
-"\n",
-"<https://example.org#tmp0.hdf/test> a hdf:Dataset ;\n",
-"    hdf:dataspace <https://example.org#tmp0.hdf/test__dataspace> ;\n",
-"    hdf:datatype hdf:H5T_IEEE_F64LE,\n",
-"        \"H5T_FLOAT\" ;\n",
-"    hdf:layout hdf:H5D_CONTIGUOUS ;\n",
-"    hdf:maximumSize -1 ;\n",
-"    hdf:name \"/test\" ;\n",
-"    hdf:rank 0 ;\n",
-"    hdf:size 1 ;\n",
-"    hdf:value 4.3e+00 .\n",
-"\n",
-"<https://example.org#tmp0.hdf/test__dataspace> a hdf:ScalarDataspace .\n",
-"\n",
-"<https://orcid.org/0000-0001-8729-0482> a prov:Person ;\n",
-"    foaf:firstName \"Matthias\"^^xsd:string ;\n",
-"    foaf:lastName \"Probst\"^^xsd:string .\n",
-"\n",
-"hdf:H5T_IEEE_F64LE a hdf:Datatype .\n",
-"\n",
-"\n"
-]
-}
-],
+"outputs": [],
 "source": [
 "with h5tbx.File() as h5:\n",
 "    h5.create_dataset(name='test', data=4.3)\n",
@@ -167,7 +70,7 @@
 },
 {
 "cell_type": "code",
-"execution_count": 5,
+"execution_count": null,
 "id": "064a9e31-1fd0-429e-9189-663e950bc52f",
 "metadata": {},
 "outputs": [],
@@ -180,7 +83,7 @@
 },
 {
 "cell_type": "code",
-"execution_count": 6,
+"execution_count": null,
 "id": "ce1a26f6-ad59-4014-b5b2-dcee38f11a26",
 "metadata": {},
 "outputs": [],
@@ -190,7 +93,7 @@
 },
 {
 "cell_type": "code",
-"execution_count": 7,
+"execution_count": null,
 "id": "78c9adf7-43f0-4aa1-8740-149bf3a5f74f",
 "metadata": {},
 "outputs": [],
@@ -203,18 +106,10 @@
 },
 {
 "cell_type": "code",
-"execution_count": 8,
+"execution_count": null,
 "id": "e8d13d2e-d66e-4028-8909-7192cfef1bf6",
 "metadata": {},
-"outputs": [
-{
-"name": "stdout",
-"output_type": "stream",
-"text": [
-"Warning: When cdn_resources is 'local' jupyter notebook has issues displaying graphics on chrome/safari. Use cdn_resources='in_line' or cdn_resources='remote' if you have issues viewing graphics in a notebook.\n"
-]
-}
-],
+"outputs": [],
 "source": [
 "pyvis_graph = subgraph.build_pyvis_graph(notebook=True, style=VIS_STYLE)\n",
 "# pyvis_graph.show('graph.html', notebook=True)\n",
@@ -232,37 +127,10 @@
 },
 {
 "cell_type": "code",
-"execution_count": 9,
+"execution_count": null,
 "id": "a44f0039-00b8-44c5-b237-ccba12993bdd",
 "metadata": {},
-"outputs": [
-{
-"name": "stdout",
-"output_type": "stream",
-"text": [
-"{\n",
-"  \"@context\": {\n",
-"    \"dcterms\": \"http://purl.org/dc/terms/\",\n",
-"    \"foaf\": \"http://xmlns.com/foaf/0.1/\",\n",
-"    \"prov\": \"http://www.w3.org/ns/prov#\"\n",
-"  },\n",
-"  \"@graph\": [\n",
-"    {\n",
-"      \"@id\": \"prov:Person\",\n",
-"      \"foaf:firstName\": \"Matthias\",\n",
-"      \"foaf:lastName\": \"Probst\"\n",
-"    },\n",
-"    {\n",
-"      \"@id\": \"https://example.org#tmp1.hdf/contact\",\n",
-"      \"dcterms:relation\": {\n",
-"        \"@id\": \"prov:Person\"\n",
-"      }\n",
-"    }\n",
-"  ]\n",
-"}\n"
-]
-}
-],
+"outputs": [],
 "source": [
 "with h5tbx.File() as h5:\n",
 "    h5.create_dataset(name='test', data=4.3)\n",
@@ -280,7 +148,7 @@
 },
 {
 "cell_type": "code",
-"execution_count": 10,
+"execution_count": null,
 "id": "077911e8-8bc3-4b58-a480-df605bdceae4",
 "metadata": {},
 "outputs": [],
@@ -295,18 +163,10 @@
 },
 {
 "cell_type": "code",
-"execution_count": 11,
+"execution_count": null,
 "id": "67f99459-8351-4819-8a67-2b58d4b1e398",
 "metadata": {},
-"outputs": [
-{
-"name": "stdout",
-"output_type": "stream",
-"text": [
-"Warning: When cdn_resources is 'local' jupyter notebook has issues displaying graphics on chrome/safari. Use cdn_resources='in_line' or cdn_resources='remote' if you have issues viewing graphics in a notebook.\n"
-]
-}
-],
+"outputs": [],
 "source": [
 "kg_from_jsonld = kglab.KnowledgeGraph().load_rdf_text(\n",
 "    graph.serialize(format=\"ttl\")\n",
@@ -353,7 +213,7 @@
 "name": "python",
 "nbconvert_exporter": "python",
 "pygments_lexer": "ipython3",
-"version": "3.10.18"
+"version": "3.9.23"
 }
 },
 "nbformat": 4,

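The outputs cleared from `knowledge_graph.ipynb` above showed HDF5 structural metadata serialized to Turtle and JSON-LD. For readers who want to inspect such a graph without the toolbox, here is a small standalone sketch using plain `rdflib` (assuming rdflib ≥ 6 for the built-in JSON-LD serializer); the Turtle fragment is copied from the removed notebook output, and the variable names are arbitrary.

```python
# Standalone rdflib sketch, not part of the commit: parse a fragment of the Turtle
# that was removed from the notebook output above and re-serialize it as JSON-LD.
import rdflib

TTL_FRAGMENT = """
@prefix hdf: <http://purl.allotrope.org/ontologies/hdf5/1.8#> .
@prefix ex: <https://example.org#> .
@prefix xsd: <http://www.w3.org/2001/XMLSchema#> .

ex:tmp0.hdf a hdf:File ;
    hdf:rootGroup <https://example.org#tmp0.hdf/> .

<https://example.org#tmp0.hdf/test> a hdf:Dataset ;
    hdf:name "/test"^^xsd:string ;
    hdf:value 4.3e+00 .
"""

g = rdflib.Graph()
g.parse(data=TTL_FRAGMENT, format="turtle")

# list every node that carries an rdf:type, then dump the whole graph as JSON-LD
for subject, rdf_type in g.subject_objects(rdflib.RDF.type):
    print(subject, "is a", rdf_type)
print(g.serialize(format="json-ld"))
```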